Entries Tagged as 'Linux'

gOS – Nothing but ‘Net

Strike two — another candidate falls by the wayside.  Don’t worry, this isn’t baseball, so I’m not feeling the pressure of striking out (just yet).

gOS is a decent Linux distribution, and it works — in fact, it works nicely.

The applications that come bundled are heavily dependent on Google; and it really doesn’t follow the Mac look and feel very closely (you would have to scab on a better theme, and my feeling is that’s way too advanced for the target audience).

gOS is also missing most every multimedia capability that an average user would want.  That’s allegedly to avoid legal issues in many countries, but the fact of the matter is that if you can’t play a DVD or most video and audio streams, a person is likely to find it’s just not an acceptable OS for the general public.

If you want something fairly basic that works when you install it and doesn’t require much fussing, and you’re not interested in multimedia, this might be a reasonable choice; but you’re probably better off sticking with a distribution that doesn’t carry the weight of Ubuntu (something derived directly from Debian or built from scratch).

gOS

Originally posted 2010-01-05 01:00:48.

Defragmenting

There are many people out there that say that *nix and Mac file systems don’t fragment — only Windows does.

They’re dead wrong.

[I know I’ve said this before, but it’s worth saying again]

All three file systems (and in Windows we’re talking about NTFS, not FAT) derive from the same basic file system organization, and all three have pretty much the same characteristics (there are differences, but those really have nothing to do with the likelihood of fragmentation).

Fragmentation is just a by-product of the way a file system works.  The file system must make decisions about how to lay files down on the disk, and since it doesn’t have a crystal ball it cannot see the future.  Thus if a file is pinned between two other files and it must grow, the file must either be moved (leaving an empty spot where it used to be) or extended in another area (thus becoming fragmented).

There are various schemes for handling file allocations, but most of them rely on the application that is creating the file giving the operating system (and thus the file system) sufficient information about the file’s maximum size, along with hints as to whether it is temporary, may grow, etc.
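
On Linux, for example, an application (or a user) can give the file system exactly this kind of hint by reserving a file’s full size up front.  A minimal sketch using the util-linux `fallocate` tool — the path and size here are just example values:

```shell
# Reserve 1 MiB in a single request, instead of letting the file
# grow piecemeal (and possibly fragment) as data is appended later
fallocate -l 1048576 /tmp/prealloc_demo.bin

# The file's full size is allocated immediately
stat -c %s /tmp/prealloc_demo.bin
```

Because the allocator sees the whole request at once, it can pick one contiguous run of blocks rather than scattering extents as the file grows.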

Given that file systems will fragment, the need for defragmentation is real.  Windows recognizes this (mainly because Windows used to use a FAT file system where fragmentation caused severe performance issues).

If you have a *nix or Mac based system, I’m sure you can locate a reasonably good defragmenter (not everyone is in denial about the need for periodically defragmenting the system).  If you have a Windows based system you already have a reasonably good defragmenter that came with the system (a “lite” version of Diskeeper, whose publisher now goes by the name Diskeeper Corporation).  You can, of course, purchase a number of commercial products, like the full blown Diskeeper, O&O Defrag (my personal favorite), or download a host of free or inexpensive products.

The key to defragmenting your system is knowing when you should invest the time (and wear on your disks).  The most accurate answer is when system fragmentation reaches a point where it adversely affects performance.  That seems a little vague, but most defragmentation tools will actually do an analysis and advise you whether they should be run.  Some of them offer active defragmentation (but, like the file system, they don’t have a crystal ball, and will often cost performance rather than enhance it — so I would just say no to active defragmentation).

A good rule of thumb is that right after you install your system, or any time you install major updates or service packs, you should defragment your system.  It’s a good idea to clean off temporary files (like your browser cache) before you defragment.  And you might even want to clean off old restore points (if you have them enabled).
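
As a sketch, on Windows Vista/7 that whole routine can be driven from an elevated command prompt with the built-in tools — the drive letter is just an example:

```shell
rem Clean off temporary files first (Disk Cleanup)
cleanmgr /d C:

rem Analyze only -- the report says whether a defrag pass is worthwhile
defrag C: /A

rem If it is, run the actual defragmentation pass
defrag C:
```

The analyze-first step matches the advice above: only spend the time (and disk wear) when the tool says fragmentation is actually significant.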

There’s certainly no reason to defragment your system daily or weekly; but an occasional night of running your defragmenter of choice will likely decrease boot time and increase overall system performance.

One other little tidbit — remove your paging file before defragmenting; then, after you’re finished, create a new paging file of a fixed size (i.e., set the minimum and maximum to the same value).  That way you have a nicely defragmented paging file that will not cause fragmentation or fragment itself (leading to better system performance).  Of course, if your system has enough memory to run without a paging file, you don’t need one at all.
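
On Vista/7 that fixed-size paging file can even be scripted with the built-in `wmic` tool.  A hedged sketch — the 4096 MB figures are example values only (pick sizes appropriate for your RAM), and a reboot is needed for the change to take effect:

```shell
rem Turn off automatic paging file management
wmic computersystem where name="%COMPUTERNAME%" set AutomaticManagedPagefile=False

rem Pin the paging file to a fixed size (values in MB; minimum = maximum)
wmic pagefileset where name="C:\\pagefile.sys" set InitialSize=4096,MaximumSize=4096
```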

Originally posted 2010-02-21 01:00:20.

Image and drawing programs

Most people don’t need a very sophisticated image editing or drawing program to meet their needs.

It’s simply insane that many people shell out the money for crappy products like Adobe PhotoShop, Adobe Illustrator, or Microsoft Visio for the work they need to do.

Simple image (digital photograph) editing can be done with a number of free software packages.  For many, Google Picasa or Microsoft’s Windows Live Photo Gallery will do everything that’s needed and allow for easy posting of images to a web site for others to view.

For people who want a little more power, and don’t want to be so tightly wed to what Google or Microsoft think you should do with your digital assets, there are other good choices.

Paint dot Net for Windows is a good basic image editing program that will satisfy most of your digital image editing needs.  It only runs on Windows, though, so if you’re looking for something for your Mac (because you don’t like iPhoto) or something for Linux…

GIMP is a highly portable image editing program.  It isn’t basic, it’s sophisticated and can require a moderate learning curve (think Adobe PhotoShop).  There are versions of it available for most any Linux distribution, Windows, and OS-X.  It’s totally free, and the choice of many casual and professional users.

If your needs are more along the lines of diagramming, you could simply use the Draw component in OpenOffice.  Draw is plenty capable of meeting most of your diagramming needs.  However, if you want something with more capabilities…

Dia is intended to create structured drawings.  It has many of the capabilities of Visio and simple CAD type programs.  It’s absolutely free, and available for most Linux distributions, Windows, and OS-X.

Obviously there are cases where you will need to pay a licensing fee for software; but if you’re a home user I’m sure you have much better places to put your hard earned cash.

Also, if you do feel you must buy PhotoShop, make sure you allocate the time and money to take a course at your local community college — it’s not likely you’re going to become very proficient using it on your own.

Originally posted 2010-01-16 02:00:46.

Anti-Malware Programs

First, malware is a reality and no operating system is immune to it.

Malware is most common on operating systems that are prevalent (no reason to target 1% of the installed base now is there); so an obscure operating system is far less likely to be the target of malware.

Malware is most common on popular operating systems that generally do not require elevation of privileges to install (OS-X, *nix, Vista, and Server 2008 all require that a user elevate their privileges before installing software, even if they have rights to administer the machine).

The reality is that even a seasoned computer professional can be “tricked” into installing malware; and the only safe computer is a computer that’s disconnected from the rest of the world and doesn’t have any way to get new software onto it (that would probably be a fairly useless computer).

Beyond that, exercise common sense: don’t install software you don’t need or are unsure of (remember, you can install and test software in a virtual machine using UNDO disks before you commit it to a real machine), and put a hardware “firewall” between you and your high-speed internet connection (a residential gateway device should be fine as long as you change the default password, disable WAN administration, and use WPA or WPA2 on your wireless network).  After that, anti-malware software is your best line of defense.

There are a lot of choices out there, but one of the best you’ll find is Avast! — there’s a free edition for non-commercial use, and of course several commercial versions for workstations and servers.

My experience is that on all but the slowest computers Avast! performs well, and catches more malware than most any of the big-name commercial solutions.

For slower computers that need malware protection, consider AVG (they also have a free version for non-commercial use); I don’t find it quite as good as Avast! at stopping as wide a range of threats, but it places much lower demands on resources (which helps keep your legacy machine usable).

Originally posted 2009-01-02 12:00:01.

Linux BitTorrent Clients – Follow-Up

I’ve been using several Linux bit torrent clients fairly heavily for the past week or so, and I have a few new comments about each of the “contenders” — below I’ve ordered them as I would recommend using them.

KTorrent · KTorrent might be a little “fat”, but it works, and it works very well — particularly when dealing with a large number of torrents simultaneously.  This is my pick.

TorrentFlux · TorrentFlux is probably the best solution you’ll find for a torrent server.  Simply said, it works fine (though I don’t know that I’ll continue to use it, simply because it doesn’t seem to be actively improved, and it’s far from perfect).

Transmission · Transmission is simple, and that simplicity seems to pay off — it works, it works well.

qBittorrent · qBittorrent works fairly well for a small number of simultaneous torrents; but if you want to download or seed large numbers of torrents, stay away from this one — it actually crashes, and unless your goal is just to watch the integrity of your torrents be checked over and over, you can do much better.

Deluge · Deluge was the one I really wanted to like; and it seemed to work, but it has two major problems — it doesn’t handle large numbers of torrents well, and it doesn’t properly handle port forwarding (either through UPnP / NAT-PMP or when you try to set the port forwarding manually).  We’ll just leave it at this: it has issues (that apparently are fairly well known), and progress on it is glacial in its pace.

Moving torrents from one client to another isn’t all that hard to do — a little time consuming, maybe… but once you figure out how to do it, and let your data files re-check, you’ll be on your way.

My experience over the past week reminds me that you can do your due diligence by researching every fact and figure about a program all you like; but until you put it through its paces, you just won’t know.

NOTES: My test included about 550 torrents totaling just under half a terabyte in total size.  I required that ports be forwarded through a firewall properly (either via UPnP, NAT-PMP, or by hand), and that I be able to control the total number of active torrents (preferably with control over uploads and downloads as well), and be able to restrict the bandwidth (a scheduler was a nice touch, but not a requirement).

Originally posted 2010-08-25 02:00:30.

VirtualBox: Linux Desktop, Real Performance

The other day I installed VirtualBox OSE on my Ubuntu machine so that I could migrate over a Windows Server 2003 machine.  I wasn’t really expecting great performance since I was putting the virtual disks on a single spindle…

Sometimes you get a good surprise.

When I started up the virtual instance, it seemed very fast — so I shut it down and started it again.  Then I performed a few quick tests and realized that not only was VirtualBox on an Ubuntu 10.04 LTS Linux machine substantially faster than on a Windows 7 machine (with a faster hard disk and faster processor), but it was faster than on a Windows Server 2008 machine running Hyper-V.

The really incredible thing was that Hyper-V was running on a disk array with fifteen spindles versus a single spindle for VirtualBox.

I really didn’t have any way to run a set of rigorous tests, but what I found was that as long as the disk wasn’t saturated, VirtualBox handily outperformed Hyper-V on every test (read or write) that I performed… it was only when I started to push near the limits of the drive that VirtualBox and Hyper-V had similar disk IO performance.

I didn’t evaluate how VirtualBox performed on Linux with a disk array, but my guess is that it’s simply much more efficient at scheduling disk IO than Hyper-V; and likely Linux is more efficient at disk IO than Windows, period.

I’m a huge fan of VirtualBox; and if I had known eighteen months ago what I know now about Hyper-V, I would have avoided it like the plague and simply used VirtualBox or Xen as a virtualization solution.

I’ll put a more thorough investigation of disk IO and VirtualBox versus Hyper-V performance on my “TO-DO” list; but I don’t expect it’ll float to the top until this winter at the earliest; until then my advice is choose VirtualBox (or Xen).

Originally posted 2010-08-24 02:00:27.

Thinking Inside the VirtualBox

Sun Microsystems used to be a major player in the computer world; and I guess since Java belongs to Sun they are still a fairly major force…

Sun sponsors a number of open source and free projects; and, of course, it’s VirtualBox that has inspired this post.

VirtualBox 2.0.4 was released on 24 October 2008, and from my initial experiences with it, it’s a contender.

It’s a fairly mature x86/x64 virtualization framework for x86/x64 platforms.  VirtualBox runs on Windows, OS-X, Linux, and of course Solaris.

What sets it apart?  Well, to my knowledge it’s the only fairly mature cross-platform virtualization framework that’s FREE on all platforms.

In general it doesn’t require hardware virtualization support, with the exception that to run an x64 guest you must be on an x64 host with hardware virtualization.

Going through the list of features and playing with it, there’s really nothing I could find that it didn’t do (and in playing with it, it seemed to work well).  The one feature that VirtualBox supports that none of its competitors had last time I looked (and that Hyper-V is sorely missing) is SATA (AHCI – Advanced Host Controller Interface) support.  That provides much more efficient emulation of disk channel connections to the guest, and thus much better performance — and if you recall from my post on Hyper-V, the fact that Microsoft doesn’t have SCSI boot support or AHCI support at all is what prevents me from moving to Hyper-V.

VirtualBox does apparently support VMWare virtual disks, but not Microsoft virtual disks (both provide open specifications, so my only conclusion is that Sun’s anti-Microsoft bias is at play — which is sad, since VirtualPC, Virtual Server, and Hyper-V account for a fairly substantial, and growing, segment of the market).

Like any product, you really need to carefully evaluate it based on your needs, but my feeling is that for Mac users this might be the choice if you don’t want to buy Parallels Desktop… and for Windows desktops this looks to be a very good option.

NOTES:

On Windows, if you want to use this on a server host machine (i.e., one that doesn’t require users to control the virtual machine), VirtualBox doesn’t really provide any interface for controlling machines in this manner; however, you can launch a VirtualBox machine from the command line, so you can have your server start up VirtualBox sessions at boot… though there are no tools provided by VirtualBox for managing running instances started this way.  My recommendation is that the VirtualBox team add a tool to manage and launch instances in a server environment.
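
For what it’s worth, with a recent `VBoxManage` the command-line workflow I’m describing looks roughly like this — the VM name is just an example, and older releases used slightly different flags:

```shell
# Start a VM without opening an interactive window
VBoxManage startvm "Win2003" --type headless

# See which instances are running, then shut one down cleanly
VBoxManage list runningvms
VBoxManage controlvm "Win2003" acpipowerbutton
```

A startup script issuing commands like these is about the closest thing to server-style management VirtualBox offers out of the box.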

On Windows (and other OSs) VirtualBox handles host networking (the default is a NAT’d network through the host, which could have some performance impact) by using the TUN/TAP driver.  Certainly the way Microsoft handles virtualization of the network adapter is far slicker, and I found that using host networking is not as reliable as NAT; hopefully this is an area where there will be some improvement.

Lastly, I haven’t run any actual performance tests head-to-head with Parallels, VMWare, VirtualPC, and Virtual Server… but I can tell you that guests “feel” substantially faster running under VirtualBox (I was quite impressed — and surprised).


VirtualBox

Originally posted 2008-12-08 12:00:55.

Desktop Sharing

Maybe I’ve become spoiled, but I just expect desktop sharing (remote control) to be easy and fast.

Nothing, absolutely nothing compares to Microsoft’s RDP; and virtually any Windows machine (except home editions) can be accessed remotely via RDP; and all Windows machines and Macs can access a remote Windows machine.

Apple has their own Remote Desktop Client, and it works well — but it’s far from free (OUCH, far from free).  And Apple does build VNC into OS-X (can you say dismally slow?)… but they don’t provide any Windows client.

On Linux and other *nix operating systems you can use an X session remotely, or VNC (zzzzzzzzzzzzz — again, slow).
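
For the X-session route, a minimal sketch — the hostname and application are examples, and the remote machine’s sshd must have X11 forwarding enabled:

```shell
# Tunnel a single X application over SSH; the window appears on the
# local display while the program actually runs on the remote host
ssh -X user@remotehost xterm
```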

As a “universal” desktop sharing solution VNC isn’t horrible (and it’s certainly priced right, and there’s plenty of different ports and builds of it to choose from), but it’s old school and old technology.

I personally think it would be great to have an efficient remote desktop sharing standard that all computers (and PDAs) could use… one ring — eh, got carried away there.  One client could talk to any server, and each operating system vendor would only need to optimize their own server and client…

Originally posted 2009-02-23 01:00:41.

Elive – Luxury Linux

I’ll have to start my post off with what may seem like a very unfair comment; and it may be.

I’ll prefix this by saying I never feel comfortable with individuals or companies who try to charge for Open Source software when they don’t offer anything tangible for that money, and they don’t allow (and encourage) you to try out what you’re going to be paying for before you are asked to pay for it.

Elive falls squarely into this category.

You cannot download a “stable” version of Elive from the publisher’s site unless you make a donation (I believe $10 is the minimum), though you can certainly find torrents and FTP links to download it from other sites if you’re willing to put a few minutes into it.

Strictly my opinion, but I suspect the publisher realizes that no one would ever pay him for a “stable” version of Elive, because what he passes off as stable isn’t.

When Elive boots, it’s striking, and all the applications that are installed with it seem to work nicely.  The interface, while not 100% Mac-like, is intuitive and easy to use…

So why start with such a strong negative stand?

Easy: Elive just isn’t stable.  It’s mostly form with little function.

What’s included on the CD seems to work fairly well, but start updating components or installing additional software (the VirtualBox guest additions started me on the road to ruin) and the trouble starts… laughably, you end up with an environment that has the stability of Windows 9x on junker hardware rather than OS-X (or Linux).

I suspect that the failing of Elive is that it isn’t a collaborative project of many people; nor is it a commercial venture from a publisher with the resources to adequately test it.

I simply wouldn’t pursue it the way it’s being pursued — but I like quality, and would simply not be comfortable asking for donations from people who will probably end up not being able to use the version they donated for (and there’s no mention that you get upgrades for life for free, or need only donate again when you feel you’ve gotten something of substance).

My advice… look at the free “unstable” build, play with it, make it do what you want it to do — and when it crashes, move on; don’t expect a great deal more from the “stable” version.

Hopefully, though, others will look at Elive and see the potential and we’ll see another distribution that is every bit as flashy and way more stable.

Elive

Originally posted 2010-01-04 01:00:17.

Windows Live Essentials 2011 – Live Mail

Or perhaps better titled: Why I continue to use a product I hate.

When Outlook Express debuted many years ago, Microsoft showed the possibility of creating an email reader for Windows that was clean, simple, and powerful… and for all its problems, Outlook Express worked.

When Microsoft shipped Windows Vista they abandoned Outlook Express in favor of Windows Mail; largely it appeared to be the same program with a few changes to make it more Vista-like.

But not long after Windows Mail hit the street, Microsoft decided to launch Windows Live Mail, and what appears to be a totally new program modeled after Outlook Express / Windows Mail was launched.  I say it was new because many of the bugs that were present in the BETA of Windows Live Mail were bugs that had been fixed in the Outlook Express code line years before (as an interesting note, several of the bugs I personally reported during the BETA of Windows Live Mail are still present in the newest version – 2011).

The previous version of Live Mail was tolerable; most of the things that were annoying about it had fairly simple ways to resolve them — and in time, maybe we’ll all figure out ways to work around the headaches in 2011; but I just don’t feel like putting so much effort into a POS software package time and time again…

And for those of you who say it’s “FREE” so you get what you get, I’d say no — it’s not exactly free… Microsoft understands that software like this is necessary in order to have any control over users’ internet habits, so it isn’t free — you’re paying a “price” for it.

Plus, there are other alternatives… Thunderbird for one.

Why don’t I use Thunderbird?  Simple: there is one “feature” lacking in Thunderbird that prevents me from embracing it — you cannot export account information and restore it.  Sure, MozBackup will let you back up a complete profile and transfer it to another machine, but I want access to individual email accounts.

Why?  Well, here’s the scenario that I always hit.

I travel, and I tend to take my netbook with me — and often I’m using my cell phone to access the internet… while it’s “fast” by some standards, if you were to re-sync fifty email accounts, each with a dozen IMAP folders, it would take all day.  Further, most of those email accounts are uninteresting on a day-to-day basis, particularly when I travel — I only want to access a couple of those accounts for sure, but I might want to load an account on demand (you never know).  What I do with Live Mail is keep the IAF files for all my email accounts stored on disk (I sync them from my server), and I set up the mail program by loading the three or four that I use routinely; the others I only load as I need them, and I remove them from Live Mail when done.

OK — so that doesn’t fit you… here’s another.

You’ve got several computers, and you’d like to set up your email quickly and painlessly on all of them… but you don’t need all your email accounts on every one of them — plus you add and remove accounts over time.  Again, Live Mail and its import/export handles this nicely.  You simply export a set of IAF files, and then import the ones you want on each machine.

The question is why doesn’t Thunderbird have this ability?

Well, there was a plug-in for an older version of Thunderbird that did kind of do this; of course it didn’t work that well for the version it was written for, and it doesn’t work at all with newer versions.

One more thing that I consider an annoyance (though it’s probably slightly more than that): there is no easy way in Thunderbird to change the order of accounts in the account window — and they’re not ordered alphabetically (that would make too much sense), they’re ordered chronologically (based on when you created them).  So you can re-order them, if you delete the accounts and add them back in the order you’d like them to appear; but wait, there’s no way to add an account in Thunderbird without typing in all the information again.

And if you’re thinking, “OK, so write a plug-in that manages account ordering and import/export” — sure, that would be the “right” thing to do if Thunderbird really had an interface to get to that information easily; but no, it appears you’d have to parse a JavaScript settings file… oh joy.

These should be core features of Thunderbird; and in my mind they are huge barriers to wide acceptance.

Originally posted 2010-11-12 02:00:32.