Sunday, December 28, 2014

Login failures and the joy of Linux

Linux login screen, Xubuntu 9.04 (Photo credit: Wikipedia)
I have only Linux running at home, most of the machines on Mageia. Which means that I am also Technical Support. A few days ago, the kids complained that they couldn't log in on the shared computer near the kitchen. I tried logging in and after I entered my password, a dialog box appeared and said "The name org.gnome.DisplayManager was not provided by any .service files". Clicking on OK would land me back at the login page. Fortunately, I solved it pretty quickly.
I found out that there had been a power outage and the machine had restarted with a filesystem error. The filesystem repaired itself, but then this error message started appearing. I reckoned one of the config files got mangled and needed to be re-installed. If you are new to Linux, this is not as bad as it sounds. This isn't the only Linux box in the house, so I had options. My first guess was that the MATE/GNOME config files in my home directory were messed up. So I logged in as root. It logged me in without any error.
A quick Google search suggested that my GDM custom.conf file was likely mangled. I compared it with my laptop's version and it was the same. Then I remembered that Mageia didn't use GDM but LightDM instead. And then I realized that all I had to do was switch display managers. Mageia came with about four, so I was spoilt for choice. I opened up the Mageia Control Center, chose Boot and then Display Manager. I chose GDM, saved and logged out. Problem solved.
Sorta. I will have to get around to fixing LightDM but there is no rush. GDM looks almost the same, and the Mageia developers went to great pains to ensure all the graphics were consistent. So the only difference my kids saw was that instead of a drop-down list with their names to choose from, their names were now in a list in a dialog box. They looked at it for about three seconds and knew immediately what to do.
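For the record, the Control Center is most likely just writing the choice to a config file. If Mageia still does this the Mandriva way, the display manager is recorded in /etc/sysconfig/desktop, so the same switch could probably be made as root with something like the command below (the file location and variable name are my assumption, so check the file first):
sed -i 's/^DISPLAYMANAGER=.*/DISPLAYMANAGER=gdm/' /etc/sysconfig/desktop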
This is one of the reasons I love using and working with Linux. Not only does it give you choices, those choices are modular to the point where when one breaks down, you choose another that does the same thing and move on. This would have been a major catastrophe on MS Windows. I'd be looking at a re-installation at least. If I knew which file was corrupted, I could replace it, but I wouldn't know whether it would be the same version as the other MS Windows components.

Tuesday, November 11, 2014

Beware: Blogger deletes everything in HTML editing if you don't save

Some words of warning when editing in HTML in Blogger and what you should do every single time to avoid disappointment.
I am livid. I was working on a post for hours when I decided to edit something in the HTML view of the editor. I saved the post before switching to HTML view. After some tweaking, I decided to drop the changes and revert. I clicked the Close (post) button and it warned me that all changes would be lost. I was fine with that and said OK. When I opened the post again, it was gone. Blogger decided that since I didn't want to save, it should save nothing. Literally nothing. Blogger saved an empty page.
I lost all the work from the past few hours. Pressing the back button sometimes brings back the previous state, but not this time. Blogger was serious. Even though I had saved, closed and opened my post several times, it didn't change the state of the page. I was working within one page as opposed to moving from page to page. I wonder how many people have done this and moved to WordPress or Tumblr in disgust. Perhaps this is one small way Blogger is killing blogging.
The way to avoid this and have a backup before editing a post in HTML is to use the Preview function. Preview generates a preview of the page in another tab. Then switch back and edit in HTML. If things go south, you at least have the text in the preview tab.
Thanks Blogger, for nothing. 

Monday, October 27, 2014

I touched my laptop screen and I liked it

I finally decided that I needed a new laptop. My 2008 HP Mini was really showing its age and I wanted to do some work with VMs that would tax my desktop. I did my homework and was content to buy a low-end laptop, hoping that Linux would be able to detect the 'standard' configuration without much fuss. Through a surprising turn of events, I ended up with a Lenovo Ideapad S410p Touch, a laptop with a touchscreen. It was an Intel i5 machine with 4GB of RAM (which I bumped up to 8GB), both VGA and HDMI outputs, and a DVD drive to boot.
Kids love a touchscreen (Photo credit: Wikipedia)
So how did it come about that way? I had done my homework and gone to buy the Lenovo laptop that didn't have an OS bundled, or the 'DOS version' as they called it. How many people buying new computers remember what the heck DOS is, is another question. But that range only came with AMD CPUs and, having done that in the past (and gotten nothing other than a warm lap and mediocre performance), I decided to go for the Intel version, the i5 specifically. But to keep my options open, I decided to also keep an open mind on the AMD A10 CPU, which most reviewers rated as good as the i5 even though it was meant to compete with the i7s.
Next was to find someone who knew what they were talking about. Too many times, I have been besieged by salespeople who knew little about what they were selling. It was time to give the right guy his due. I finally found a chap who gave me several options and let me try the laptops. Finally, I decided to ditch the A10 and went firm with the i5. He found me two models that fit the bill: a Windows 8 machine with a touchscreen, and the OS-free version without a touchscreen.
For some reason, the non-touchscreen Lenovo laptop was slightly pricier and was from a different model range. I did get the notion that the guy wanted to get rid of it because it was an older model. A quick check showed it was still listed as current on the Lenovo website, so I figured that it wasn't all that old. I decided I might as well see what the fuss was about Windows 8 and the touchscreen interface.

Monday, September 15, 2014

Going Minty 3 - Solving why Gimp is opening PDFs on Chromium

Something quite strange cropped up in Linux Mint that I never encountered on Mageia. It's strange because it seems counter-intuitive, especially for a distribution that does so well in keeping things user-friendly. The odd thing that happened to me in Linux Mint was that Chromium opened PDFs with Gimp.
GIMP 2.2.8 graphic software (Photo credit: Wikipedia)
Now this is not too bad if you have a good PC. And it's not wrong either, because Gimp can open PDFs and, better still, edit them. But if you want to open a multi-page PDF, Gimp will render every page up-front. Meaning that if the PDF has a lot of pages, it's gonna take some time. If your rig has less than 1 GB of RAM, the wait becomes even worse.
The solution is obvious: change the default setting or program for opening PDFs. Unfortunately, that didn't work for me. Whatever I set it to, the default stayed at Gimp. I do get a choice to switch to another program each time, but it tends to get annoying. So how does one change the default application? Apparently there is a common set of utilities called xdg (xdg-utils) that helps with opening files. Applications under freedesktop.org call on xdg to help them open document files. So for Chromium, after it downloads a PDF file, it calls on xdg to open it. xdg determines the actual viewer and passes the name of the PDF to the viewer for it to open. The definition of the 'actual viewer' is either set by the underlying environment (KDE, GNOME, etc.) or by xdg itself. The command is as follows:
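(xdg-mime is part of xdg-utils; evince.desktop below is just my example, so substitute the .desktop file of whatever PDF viewer you actually use.)
xdg-mime default evince.desktop application/pdf
To check what the current default is:
xdg-mime query default application/pdf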

Sunday, September 14, 2014

Away and back

There is no other way to say it. I haven't posted much in the past few months. Simply put, work overtook free time. In fact, work overtook everything else. So much so that I had to come to a decision: choose work, or everything else.
Don't get me wrong. I loved working with the people I've been working with the last few months. They were, and still are, some of the smartest, most positive people I've worked with. Whatever came our way, we took on the problems and dealt with them the best way we could, with whatever we had. We played the hand we were dealt, no excuses. Inclusion was a big theme. Information was shared freely and bullshit was called out without shaming and without shame. Getting things done was the song of the day and it drowned out everything else.
An artist's depiction of the rat race (Photo credit: Wikipedia)
But it came with heavy costs. Free Time fell first. Health came next. I'm sure Sanity would have been the next casualty. It's a big problem for me because I've seen how lives and families were lost when work took over everything. I could learn from lessons past or forge ahead.
So I made the decision. I value my life and my family more than work. Work is money, but having gained some hindsight from watching others, I've saved some just for a rainy day like this. Money can always be earned elsewhere. But love is life. And I love my life.
I have a ton of posts in draft, so expect to see more in the coming weeks. Thanks for sticking around.

Saturday, March 01, 2014

What Facebook saw in WhatsApp and Liked it enough to buy them

Sizing up WhatsApp and Twitter (Photo credit: Tsahi Levent-Levi)
A lot of people are scratching their heads about the Facebook deal with WhatsApp. Most of those heads are in the US. They just can't see why Facebook would pay so much money for a company that charges a dollar a year to use it, with the first year free. In fact, WhatsApp seems to be looking for ways to give itself away for free. In the early days, all you had to do to get another year for free was to uninstall and reinstall the app. In some countries, using WhatsApp doesn't even count against the data cap.
So what is Facebook really buying? It's very simple: Facebook is buying users. The popularity of WhatsApp in the rest of the world is so huge that it dwarfs the so-called popular messaging platforms. But what makes it most interesting is how loyal users are to it. Rather than bore you with numbers, here are the 5 reasons it is so popular and why Facebook splurged serious cash for it.
It's cross-platform where it matters.
To a lot of people, especially on iOS, WhatsApp was the way they communicated with their non-iPhone friends. It was also the app Blackberry users told their friends to install if they wanted to send messages to them à la BBM. Using WhatsApp allowed you to join your friends on BB and iPhones.
While messaging platforms in the past were also cross-platform, the platforms they covered were traditionally computer-centric. WhatsApp is all about mobile platforms, from iOS and Android all the way to common Symbian phones. Which makes it accessible to far more people than PCs ever were. For the younger generation, especially in the rest of the world, a smartphone is their first computer. Which is partly why there are so many active WhatsApp users.
It ties in with your phone number.
This is the secret sauce. WhatsApp identifies you by your phone number. At first glance this may not seem like a big thing. But by making your phone number your unique ID, it ties you, the WhatsApp user, to a verified ID. Your phone company verified you as a paying customer, their definition of a "person". Different phone companies have different regulations for who can have a phone number. Each country has its own laws regarding phone number ownership. WhatsApp rides on these laws and regulations to ensure that the phone number being registered to WhatsApp actually belongs to a person. This, plus the fact that users can only message people in their phone book or groups that they can leave at any time, raises the bar of entry for bots and spammers.
Plus, having a globally unique ID like the phone number is a programmer's dream. They now have a way to follow you from phone to phone and keep you connected to your friends. Switch your handphone, even switch to another platform: all you have to do is insert the SIM card, install WhatsApp, and you start getting your messages and can continue discussions in your WhatsApp groups. For those of us who can't figure out how to transfer contacts, this is really useful because your friends' names appear next to their phone numbers in the discussions. You can then add them back into your contacts on the new phone.

Monday, February 24, 2014

Recover from a bad superblock

When things go really bad, you may not be able to recover a disk. In those times, think of salvaging the data, reformatting and living to fight another day. Consider how valuable the data is versus the time spent on repairing something that is damaged and may not be salvageable. testdisk, photorec and ddrescue are the tools to think of when you come to that decision.
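If you do go the salvage route, make an image of the partition before poking at it any further. Something along these lines, using GNU ddrescue (the device and output file names are just my examples), is a sensible first step; you can then point the recovery tools at the image instead of the failing disk:
ddrescue -n /dev/sdf1 rescue.img rescue.map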
But I do enjoy a challenge and when a USB disk was brought to me with mounting problems, I just couldn't pass it up. It was an uncommon setup. The USB stick had two partitions, one with an ext3 filesystem and the other with FAT32. I decided to focus on the ext3 filesystem first.
FSCK (Photo credit: SFview)
To cut a long story short, my efforts to mount the disk were met with screens full of error messages and cryptic clues as to what went wrong. Running fsck seemed to clean it at first, but it still would not mount the partition. Running fsck again would yield more errors, and a different set each time. My previous boss loved to use the expression "time to decide: fish or cut bait". It was one of those times.
This is probably the last-ditch effort before you make that fateful decision. This is the line in the sand, the one you have to cross before deciding to put your effort into getting the data out and starting all over again.
The recovery process involves rewriting the information about the partition. Specifically, reinitializing the superblock and group descriptors. However, reinitializing does not touch the data part of the partition. It does not touch things like the inodes and the blocks themselves. So by starting out with a 'fresh' set of information that is used to mount the disk, there is a possibility that the data may still be readable. After that, the data part gets checked and hopefully what you end up with is a filesystem that can be mounted properly.
The process can only be done when the partition is not mounted. If you have tried other ways, it most probably isn't. Mine wasn't, obviously.
So here's the process.
1. First, figure out the block size of the USB drive (in this case /dev/sdf1). I need that information to re-build the partition information. Run the command
dumpe2fs /dev/sdf1 | grep -i 'block size'
Block size:               4096

2. Then format the superblocks. The command below won't format the whole partition, only the superblocks. It is critical that you use the correct block size gathered from the previous step.
mke2fs -S -b 4096 -v /dev/sdf1

3. Now that the partition information is 'fresh', I checked the inodes to figure out what else could be wrong with the filesystem. Remember, ext3 = ext2 + journalling, so ext2 tools still work.
e2fsck -y -f -v -C 0 /dev/sdf1

4. Now that I'm done with one element of the ext3 equation, it's time to fix the journalling system, or more specifically, the journal itself.
tune2fs -j /dev/sdf1

5. Re-attempt to mount the partition. If everything went well, it should mount and you should be able to read the data.
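For me that meant something like the following (the mount point is just an empty directory I created for this; use whatever path suits you):
mkdir -p /mnt/rescue
mount /dev/sdf1 /mnt/rescue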

After that, for hard disks, you have to determine whether the disk has reached its threshold limits. Things like SMART attributes will help you get that information.
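If you have smartmontools installed, something like this should dump those attributes (substitute your actual disk device; USB enclosures sometimes need extra options to pass SMART commands through):
smartctl -a /dev/sdf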

Interested to know more? http://ubuntuforums.org/showthread.php?t=1681972&page=5&p=10434656#post10434656


Sunday, February 16, 2014

Is Ubuntu licencing Linux? Canonical looking for value in the wrong places

Linux Mint 11 (Photo credit: Wikipedia)
Full disclosure: I am no fan of Ubuntu. I applaud their efforts to put Linux in as many hands as possible with the free CD distribution effort, but I'm of the opinion that Ubuntu puts itself above Linux while riding on the contributions of open source developers to Linux in general. I applaud their focus on making Linux user-friendly, but I'm of the opinion that their effort is no better than that of other distro developers like Mandrake/Mandriva in the past. To top it off, I've predicted the path Ubuntu will eventually take once it decides it does not need the community any more.
So it comes as no surprise that the latest move by Ubuntu to protect 'its intellectual property' is to licence Ubuntu. Sounds harsh? Some people will think I am being unfair, using language normally used to describe Caldera. How else should I react when Canonical is asking derivative distros to sign a licence to use 'Ubuntu binaries'? Ubuntu apologists have already made their stand known. They have made light of the gravity of this licensing demand, trying to convince us that the issue is about protecting the Ubuntu brand when it comes to derivative distros, Linux Mint specifically.
I have to ask: why Linux Mint specifically? Does Canonical ask the same of Kubuntu and Lubuntu? Is it because Linux Mint is becoming increasingly popular at Ubuntu's expense? I've been thinking about writing on the possible danger of other distros basing their work on Ubuntu, and how risky it is to build on a source that is actively consolidating its hold on what it produces. I guess I don't have to now.
Really, I don't. At the end of this post are links to articles that go into this deeper.
