Entries Tagged as 'Linux'

Linux usability

While doing my preliminary look at usability in several Linux distributions that had adopted a Mac-ish paradigm, I decided I needed to lay down several ground rules in order to review them fully.

First, I decided that using a virtual machine was fine for getting initial impressions, but that just wasn’t going to be acceptable for a complete review… and I also decided that doing a review on only one piece of hardware wasn’t going to give me a very good idea of what hardware-related problems a user might see.

It’s certainly no problem for me to find a computer or two to install these Linux distributions on and run them through their paces; however, I don’t have any “low-end” hardware, so my tests will use fairly current generations of hardware.  Be aware that my impressions might not match yours if you’re planning on running these on hardware that is more than a couple of years old (by which I mean hardware whose components were current more than two years ago).

I’ll perform the following:

  1. Install the distribution (without having to specify any settings manually)
  2. Update the system (and applications)
  3. Start up, shut down, log on, log off
  4. Browse the web (that’s a given)
  5. Read email (including setting up the email program)
  6. Play a CD (music)
  7. Play several music files
  8. Play a DVD (movie)
  9. Play several video files
  10. Edit a WYSIWYG document
  11. Edit an image
  12. View and print a PDF
  13. Access a thumb drive
  14. Access files stored on a network device
  15. Access Secure Digital media (through a USB card reader)
  16. Scan an image
  17. Open a ZIP archive; create a ZIP archive
  18. Email an attachment, recover an email attachment
  19. Install a new (and useful) application
  20. Alter the appearance (preferably using a theme)

Beyond these simple tests I’ll try to appraise the simplicity, clarity, and ease of use of the interface… I’ll also comment on the overall appearance: the look and feel.

Originally posted 2010-01-08 01:00:19.

Linux BitTorrent Clients – Follow-Up

I’ve been using several Linux BitTorrent clients fairly heavily for the past week or so, and I have a few new comments about each of the “contenders” — below I’ve ordered them as I would recommend using them.

KTorrent · KTorrent might be a little “fat”, but it works, and it works very well — particularly when dealing with a large number of torrents simultaneously.  This is my pick.

TorrentFlux · TorrentFlux is probably the best solution you’ll find for a torrent server.  Simply said, it works fine (though I don’t know that I’ll continue to use it, because it doesn’t seem to be actively improved, and it’s far from perfect).

Transmission · Transmission is simple, and that simplicity seems to pay off — it works, it works well.

qBittorrent · qBittorrent works fairly well for a small number of simultaneous torrents; but if you want to download or seed large numbers of torrents, stay away from this one — it actually crashes, and unless your goal is just to watch your torrents’ integrity be checked over and over, you can do much better.

Deluge · Deluge was the one I really wanted to like; and it seemed to work, but it has two major problems — it doesn’t handle large numbers of torrents well, and it doesn’t properly handle port forwarding (either through UPnP / NAT-PMP or when you try to set the port forwarding manually).  We’ll just leave it at this: it has issues (that apparently are fairly well known), and progress on it is glacial.

Moving torrents from one client to another isn’t all that hard to do; a little time consuming maybe… but once you figure out how to do it, and let your data files re-check, you’ll be on your way.
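For what it’s worth, here’s a minimal sketch of that kind of move in Python; it assumes you’re leaving Deluge (which keeps a copy of every loaded .torrent file in its state directory) and that the destination client is set up to watch a folder (the paths are illustrative):

    import glob, os, shutil

    # Deluge keeps a copy of each loaded .torrent in its state directory;
    # most clients (Transmission, KTorrent with the scan-folder plugin, etc.)
    # can auto-load torrents dropped into a watched folder.
    DELUGE_STATE = os.path.expanduser("~/.config/deluge/state")
    WATCH_DIR = os.path.expanduser("~/torrents/watch")  # assumed watch folder

    os.makedirs(WATCH_DIR, exist_ok=True)
    for torrent in glob.glob(os.path.join(DELUGE_STATE, "*.torrent")):
        shutil.copy2(torrent, WATCH_DIR)
        print("queued", os.path.basename(torrent))

    # Point the new client at the existing download directory; it will
    # hash-check ("re-check") the data and resume seeding from there.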

My experience over the past week reminds me that you can do your due diligence by researching every fact and figure about a program all you like; but until you put it through its paces you just won’t know.

NOTES: My test included about 550 torrents totaling just under half a terabyte.  I required that ports be forwarded through a firewall properly (either via UPnP, NAT-PMP, or by hand), that I be able to control the total number of active torrents (preferably with control over uploads and downloads as well), and that I be able to restrict the bandwidth (a scheduler was a nice touch, but not a requirement).

Originally posted 2010-08-25 02:00:30.

Video Encoding

A little over a year ago one of my friends with a Mac wanted to get into re-encoding video; I knew about the tools to do it on a PC, but none of the tools really had an OS-X port at that time, so I set out on a quest to find tools that could enable a person who didn’t know much about video encoding to accomplish it.

One of the first tools I stumbled on was HandBrake; it was an Open Source project leveraging a number of other Open Source products, intended to create a cross-platform suite of video encoding tools that was reasonably straightforward to use and produced reasonably good results.

Well, the version I tested was a near total failure… but the project showed promise and I kept tabs on it for quite some time.

Over the past year it’s steadily improved.  In fact, I’m probably being a little hard on it, since right after I played with an early version a much improved version was available that did work, and did allow my friend to accomplish what he wanted.

Last month HandBrake released a new version — a much improved version.

With Windows, OS-X, and Linux versions you can try out HandBrake for yourself and see the results.

I did two separate tests (for some reason I always use the same two DVD titles — Saving Private Ryan and Lord of the Rings — because both movies have a wide range of video types, from near still images to sweeping panoramic views to everything in motion, i.e. blowing up)…

I had two separate machines (a Q9300 and a Q9400, both with 8GB of DDR2) doing the encodes, and did both normal and high profiles; one test was H.264 into an MPEG4 container with AAC created from the AC3 5.1 track; the other was H.264 into an MKV container with AAC created from the AC3 5.1 track, in addition to AC3 5.1 pass-through and Dolby Surround pass-through with [soft] subtitles.
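For reference, the MKV test above roughly corresponds to a HandBrakeCLI invocation along these lines (a sketch only; preset names and audio-encoder flag spellings vary between HandBrake releases, so check HandBrakeCLI --help on your version):

    import subprocess

    # Sketch of the MKV encode described above, driven through HandBrake's
    # command-line interface.  Flag spellings vary by HandBrake release.
    cmd = [
        "HandBrakeCLI",
        "-i", "/dev/dvd",             # source: DVD drive (or a VIDEO_TS folder)
        "-o", "movie.mkv",            # MKV container
        "--preset", "High Profile",   # bundled high-quality H.264 preset
        "-a", "1,1",                  # take the first audio track twice...
        "-E", "faac,copy:ac3",        # ...once as AAC, once as AC3 pass-through
        "-s", "1",                    # first subtitle track as a soft subtitle
    ]
    subprocess.run(cmd, check=True)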

For the high profiles: Lord of the Rings took a little over three hours; Saving Private Ryan took just under two and a half hours — so don’t get in a hurry; in fact, run it overnight and don’t bother the computer(s).

The high profile achieved about a 2:1 reduction in size; the normal profile achieved about a 4:1 reduction in size.  The high profile’s video was stunning, the normal profile’s video was acceptable.  The AAC audio was acceptable; the AC3 5.1 was identical to the source, and in perfect sync.

There are a number of advantages to keeping your video in an MPEG4 or MKV container versus a DVD image… it’s much easier to catalog and play, and of course it’s smaller (well, you could keep the MPEG2-TS in an MKV and it would be identically sized, but I see little reason for that).

The downside of RIPping your DVDs is that you lose the navigation stream and the extra material.  Do you care???

HandBrake will read source material in just about any format imaginable (and in almost any container as well)… you can take a look at its capabilities and features online.

I’ve got some VCR capture streams in DV video that I’m encoding now — trying a few of the more advanced settings in HandBrake to see how it works (well, that’s not really testing HandBrake, that’s testing the H.264 encoder).  My expectation is that once I get the settings right, it will do a fine job; but with video captures you should never expect the first try to be the best (well, I’m never that lucky).

While HandBrake is very easy to use, your ability to get really good results from it is going to partially depend on how willing you are to learn a little about video re-encoding (which will require a little reading and a little experimentation).   But that said, NO product is going to magically just do the right thing in every case…

Overall I would say that HandBrake is one of the best video encoders you’re going to find, and you cannot beat the price — FREE!

Here are some additional notes.

For Windows 7 you will want to download the DivX trial and just install the MKV splitter (nothing else is needed) so that Windows 7 can play media in an MKV container using its native CODECs.

With Windows Media Player 12 and Media Center I haven’t figured out how to switch audio streams; so make sure the audio stream you want as the default is the first stream.  With Media Player Classic and Media Player Classic Home Cinema it’s easy to select the audio stream.  Also, Windows Media Player will not render AC3 pass-through streams; it will just pass them through the SPDIF/Toslink to your receiver — so you won’t get any sound if you’re trying to play it on your PC.

Don’t delete any of your source material until you are certain that you are happy with the results; and you might want to backup your source material and keep it for six months or so just to be sure (yeah — I know it’s big; but a DVD will fit on a DVD).

Handbrake

Originally posted 2009-12-17 01:00:07.

Usability Summary

I think I can sum up the real problem with Linux or any open source system where there’s no real usability mandate…

Developers make arbitrary decisions that suit their own needs, without any regard to how others will view those decisions, or whether others can even figure out how to use what they build… the real difference between Windows and OS-X on the one hand and Linux on the other is that two of those are the cooperative efforts of “experts” who try very hard to address the needs of a target audience that wouldn’t be capable of writing its own operating system.

And, of course, with something like Linux it’s geometrically worse than with most open source software, since any given Linux distribution is a conglomeration of hundreds of separate open source modules put together in a completely arbitrary fashion.

It really is funny that what I’ve been describing as a lack of cohesiveness is layered; and I suspect that no matter how hard a single developer tries to wrap it inside a nice pretty shell that gives a forward-facing pretense of a system planned and targeted for productivity, the ugly truth of how much of a patchwork it is will show through… and we can look back on early versions of Windows and MacOS and see just that.  It’s really only been within the last five or six years that those systems have risen to the point that they are in fact fairly cohesive, designed to be tools for people to solve problems with, not projects built for the sole purpose of developing a life of their own.

Without some unifying direction, the only Linux I can see succeeding is Android; and that, my friends, is likely to become a collection of closed source tools running on top of an open source kernel.  Trust me, you haven’t seen an evil empire until Google gets on your desktop, phone, set-top box, etc…

Originally posted 2010-01-11 01:00:10.

Linux on the desktop

I’ve been experimenting with Linux as a server for several months now; and I have to say for the price it’s a clear winner over Microsoft Windows Server 2008.

Other than desktop search, Linux has been a clear winner across the board.  Network file sharing, application services, etc. all seem to work, and work well.  Plus, with the webmin GUI for managing the server, it’s extremely easy — easier in fact than figuring out where to go to do the task at hand in Windows Server 2008.

With my success using Linux as a server, I have decided (once again) to investigate Linux as a desktop replacement for Windows… after all, how much does one normally do with a desktop?

I experimented briefly with Ubuntu on a laptop when I was cloning the drive in it, but I didn’t put it through exhaustive paces (I was quite impressed that Ubuntu auto-magically installed drivers for all the hardware in the notebook; though that feat was no better than Windows 7).

I need to go over my requirements a few more times before I start the test, but what I believe is important is:

  • Hardware support, including multiple displays, scanners, web cams, etc.
  • Office (OpenOffice will work the same as it does on Windows)
  • Financial Management (I guess I’ll have to move over to MoneyDance; it’s not free, but it’s fairly well thought out)
  • Media Playback (VLC runs on Linux just like Windows, plus there are a number of media players I’ll take a look at)
  • DVD RIPping (my last try to do that on Linux wasn’t very successful)
  • Video transcoding (I think HandBrake is broken on the current version of Ubuntu — so that might take a little work)

I’ll also evaluate it for ease of use and customization…

The evaluation will be done on an Intel DG45ID motherboard (G45 chipset) with an Intel Core2 E7200 and 4GB DDR2, multiple SATA2 hard drives, and a SATA DVD-RW; I’ll test with both an nVidia 9500 and the Intel GMA (X4500HD) controller, running both the 32-bit and 64-bit Ubuntu 10.04 LTS distributions.

Let the fun begin!

Originally posted 2010-08-12 02:00:28.

conglomeration

con·glom·er·a·tion (kən-glŏm′ə-rā′shən)
n.

    1. a. The act or process of conglomerating.
       b. The state of being conglomerated.
    2. An accumulation of miscellaneous things.

The American Heritage® Dictionary of the English Language, Fourth Edition copyright ©2000 by Houghton Mifflin Company. Updated in 2009. Published by Houghton Mifflin Company. All rights reserved.


conglomeration [kənˌglɒməˈreɪʃən] n

  1. a conglomerate mass
  2. a mass of miscellaneous things
  3. the act of conglomerating or the state of being conglomerated

Collins English Dictionary – Complete and Unabridged © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003


conglomeration a cluster; things joined into a compact body, coil, or ball.

Examples: conglomeration of buildings, 1858; of chances; of Christian names, 1842; of men, 1866; of sounds, 1626; of threads of silk worms, 1659; of vessels, 1697; of words.

Dictionary of Collective Nouns and Group Terms. Copyright 2008 The Gale Group, Inc. All rights reserved.


The SCO infringement lawsuit over the Unix trademark is over… the courts have ruled that Novell owns the Unix trademark and copyright, and SCO has no grounds for its litigation.  Just as Microsoft owned and retained the Xenix copyright while SCO distributed that operating system, so Novell retained the Unix copyright while SCO distributed that operating system.

Which means Novell now has a prime asset — and could be ripe for harvesting (that’s a poetic way to say merger, take-over, buy-out).

Which will likely be bad for Linux.

WHAT?

Yep.  Take a look at what happened when Oracle purchased Sun (one of the largest companies supporting Open Source innovation in Linux, virtualization, etc.): there’s definitely movement in Oracle to retreat from the Open Source and free (free – like free beer) software efforts that Sun was firmly behind.

Consider what happens if a company acquires Novell and uses the System V license from Novell to market a closed source operating system and discontinues work on Suse; or at minimum decides not to distribute Suse for free (free – like free beer).

“Live free or die” might become a fading memory.

Originally posted 2010-06-05 02:00:18.

Disk Bench

I’ve been playing with Ubuntu here of late, and looking at the characteristics of RAID arrays.

What got me onto this was that when I formatted an ext4 file system on a four-drive RAID5 array created using an LSI 150-4 [hardware RAID] controller, I noticed that it took longer than I thought it should; and while most readers probably won’t be interested in whether or not to use the LSI 150 controller they have in their spare parts bin to create a RAID array on Linux, the numbers below are interesting just in deciding what type of array to create.

These numbers are obtained from the disk benchmark in Disk Utility; this is only a read test (write performance is going to be quite a bit different, but unfortunately the write test in Disk Utility is destructive, and I’m not willing to lose my file system contents at this moment; I am still looking for other good benchmarking tools, and a minimal sketch of one appears below).

Controller / Array     Drives   Avg Access   Min Read     Max Read     Avg Read
ICH8 Single               1      17.4 ms     14.2 MB/s     23.4 MB/s    20.7 MB/s
ICH8 RAID1 (Mirror)       2      16.2 ms     20.8 MB/s     42.9 MB/s    33.4 MB/s
ICH8 RAID5                4      18.3 ms     17.9 MB/s    221.2 MB/s   119.1 MB/s
SiL3132 RAID5             4      18.4 ms     17.8 MB/s    223.6 MB/s   118.8 MB/s
LSI150-4 RAID5            4      25.2 ms     12.5 MB/s     36.6 MB/s    23.3 MB/s

All the drives used are similar class drives: Seagate Momentus 120GB 5400.6 (ST9120315AS) for the single drive and RAID1 (mirror) tests, and Seagate Momentus 500GB 5400.6 (ST9500325AS) for all the RAID5 tests.  Additionally, all drives show that they are performing well within acceptable operating parameters.
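And here is the kind of minimal sequential-read benchmark you can roll yourself (the path is illustrative; drop the page cache first, as root: echo 3 > /proc/sys/vm/drop_caches, or you’ll be measuring RAM rather than the disk):

    import os, time

    # Minimal sequential read benchmark (a sketch, not a real tool).
    PATH = "/dev/md0"        # block device or large file to read (assumed)
    CHUNK = 1024 * 1024      # 1 MiB reads
    TOTAL = 1024 * CHUNK     # read 1 GiB in total

    fd = os.open(PATH, os.O_RDONLY)
    start = time.time()
    done = 0
    while done < TOTAL:
        buf = os.read(fd, CHUNK)
        if not buf:          # hit the end of the device/file
            break
        done += len(buf)
    os.close(fd)

    print("%.1f MB/s" % (done / (time.time() - start) / 1e6))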

Originally posted 2010-06-30 02:00:09.

File System Fragmentation

All file systems suffer from fragmentation.

Let me rephrase that more clearly in case you didn’t quite get it the first time.

ALL FILE SYSTEMS SUFFER FROM FRAGMENTATION <PERIOD>.

It doesn’t matter what file system you use on your computer; if you delete and write files it will become fragmented over time.  Some older file systems (like, say, FAT and FAT32) had major performance issues as the file system began to fragment; more modern file systems do not suffer as much performance loss from fragmentation, but they still suffer.

If you want to argue that your writable file system doesn’t fragment, you haven’t a clue what you’re talking about, so read up on how your file system really works and how block devices work to understand why you just can’t have a file system that doesn’t fragment files or free space or both.

What can you do about fragmentation?

Well, you might not really need to do anything; modern disk drives are fast, and on a computer that’s doing many things at once the fragmentation may not have much impact on your performance.  But after a while you’re probably going to want to defragment your files.

The act of copying a file will generally defragment it; most modern file systems will attempt to allocate contiguous space for a file if they can (files that grow over time cannot be allocated contiguously, but they can be defragmented at their current size).
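On Linux you can watch this happen with filefrag (part of e2fsprogs), which reports how many extents a file occupies; here’s a quick sketch, with an illustrative path:

    import re, shutil, subprocess

    def extents(path):
        # filefrag prints a line like: "<path>: 12 extents found"
        out = subprocess.run(["filefrag", path],
                             capture_output=True, text=True).stdout
        m = re.search(r"(\d+) extents? found", out)
        return int(m.group(1)) if m else None

    def defrag_by_copy(path):
        tmp = path + ".defrag"     # copy stays on the same file system
        shutil.copy2(path, tmp)    # fresh copy gets freshly-allocated blocks
        shutil.move(tmp, path)     # replace the original with the copy

    f = "/var/tmp/bigfile"         # illustrative path
    print("before:", extents(f))
    defrag_by_copy(f)
    print("after: ", extents(f))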

On many operating systems you can actually get programs that are designed to defragment your file system.

How often should you defragment your file system?

Well, I generally recommend you do it right after installing and updating your computer, and then any time you make major changes (large software installation, large update, etc.); but don’t do it automatically or on a routine schedule — there’s not enough benefit to that.

You can also analyze your disk (again using software) to determine how fragmented it is… and then defragment when it reaches some point that you believe represents a performance decrease.

Also, try to keep your disk clean; delete your browser cache, temporary files, duplicate files, and clutter — the less “junk” you have on your disk, the less need there will be for defragmenting.

Originally posted 2009-01-05 12:00:03.

7-Zip

I’ve written about 7-Zip before; but since we’re on the verge of a significant improvement I felt it was time to highlight it again.

7-Zip is a file archiver written by Igor Pavlov.  It was originally only available for Windows, but is now available for almost every operating system.

7-Zip was one of the first archiving tools to include LZMA (Lempel-Ziv-Markov chain algorithm); and consistently demonstrated much higher compression ratios at much higher compression rates than any other compression scheme.

The next release of 7-Zip (9.10) will include LZMA2.

The source code for the LZMA SDK has been put into the public domain, and is freely available for use in other products.  The SDK includes the main line C++ source; ANSI-C compatible LZMA and XZ source code; C# LZMA compression and decompression source code; Java LZMA compression and decompression source code; as well as other source code.
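As an aside, LZMA long ago escaped 7-Zip itself; Python’s standard library, for example, ships an lzma module (wrapping liblzma), so checking the compression ratio on one of your own files takes just a few lines:

    import lzma, os

    # Compress a file with LZMA and compare sizes.
    SRC = "example.log"            # illustrative input file
    DST = SRC + ".xz"

    with open(SRC, "rb") as fin, lzma.open(DST, "wb", preset=9) as fout:
        fout.write(fin.read())     # fine for modest files; stream for huge ones

    print("original:  ", os.path.getsize(SRC))
    print("compressed:", os.path.getsize(DST))
    print("ratio: %.1f:1" % (os.path.getsize(SRC) / os.path.getsize(DST)))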

You can read about all the features of LZMA, download the Windows version of 7-Zip, and find links to p7zip for *nix operating systems on the 7-Zip site.

This is the only archive utility you need; it would have been nice had Microsoft chosen to base the folder compression in Windows 7 on the LZMA SDK, or at least made it easy to replace the compression module; but 7-Zip installs a Windows shell extension so you have a separate (though confusing for some) menu item for compression and decompression.

http://www.7-zip.org/

Originally posted 2010-01-21 01:00:14.

Linux – Desktop Search

A while ago I published a post on Desktop Search on Linux (specifically Ubuntu).  I was far from happy with my conclusions and I felt I needed to re-evaluate all the options to see which would really perform the most accurate search against my information.

Primarily my information consists of Microsoft Office documents, Open Office documents, pictures (JPEG, as well as Canon RAW and Nikon RAW), web pages, archives, and email (stored as RFC822/RFC2822 compliant files with an eml extension).

My test methodology was to take a handful of search terms which I knew existed in various types of documents and check the results (I actually used Microsoft Windows Search 4.0 to prepare a complete list of documents that matched each query — since I knew it worked as expected).

The search engines I tested were:

  • Beagle
  • Google Desktop
  • Tracker
  • Pinot
  • Recoll
  • Strigi

I was able to install, configure, and launch each of the applications.  Actually none of them were really that difficult to install and configure; but all of them required searching through documentation and third party sites — I’d say poor documentation is just something you have to get used to.

Beagle, Google, Tracker, Pinot, and Recoll all failed to find all the documents of interest… none of them properly indexed the email files — most of them failed to handle plain text files; that didn’t leave a very high bar to pick a winner.

Queries on Strigi actually provided every hit that the same query provided on Windows Search… though I have to say Windows Search was easier to setup and use.

I tried the Nepomuk (KDE) interface for Strigi — though it just didn’t seem to work as well as strigiclient did… and certainly strigiclient was pretty much at the top of the list of butt-ugly, user-hostile, un-intuitive applications I’d ever seen.

After all the time I’ve spent on desktop search for Linux, I’ve decided all of the search solutions are jokes.  None of them are well thought out, none of them are well executed, and most of them outright don’t work.

As with most Linux projects, more energy needs to be focused on working out a framework for search, rather than everyone going off half-cocked and creating yet another search paradigm.

The right model is…

A single multi-threaded indexer running in the background, indexing files according to a system-wide policy aggregated with per-user policies (settable by each user on directories they own), along with the access privileges.

A search API that takes the user/group and query to provide results for items that the user has (read) access to.

The indexer should be designed to use plug-in modules to handle particular file types (mapped both by file extension, and by file content).

The indexer should also be designed to use plug-in modules for walking a file system and receiving file system change events (that allows the framework to adapt as the Linux kernel changes — and would support remote indexing as well).

Additionally, the index/search should be designed with distributed queries in mind (often you want to search many servers, desktops, and web locations simultaneously).

Then it becomes a simple matter for developers to write new/better indexer plug-ins; and better search interfaces.
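To make that concrete, here’s a minimal sketch of the shape I have in mind; every name in it is hypothetical (this is not any real project’s API), but it shows content plug-ins keyed by extension with content sniffing as a fallback, an ownership-aware index, and an access-filtered query:

    import os

    # Hypothetical framework sketch; none of these names come from a real project.

    class ContentPlugin:
        extensions = ()                   # e.g. (".eml", ".txt")
        def sniff(self, head):            # fallback: identify by content
            return False
        def extract_text(self, path):
            raise NotImplementedError

    class EmailPlugin(ContentPlugin):
        extensions = (".eml",)
        def sniff(self, head):
            return head.startswith(b"Return-Path:") or b"\nSubject:" in head
        def extract_text(self, path):
            with open(path, "rb") as f:   # real code would parse RFC822 properly
                return f.read().decode("utf-8", "replace")

    class Indexer:
        def __init__(self, plugins):
            self.plugins = plugins
            self.index = {}               # term -> set of (path, owner_uid)

        def plugin_for(self, path):
            ext = os.path.splitext(path)[1].lower()
            for p in self.plugins:        # first try the extension map...
                if ext in p.extensions:
                    return p
            with open(path, "rb") as f:   # ...then fall back to sniffing
                head = f.read(512)
            return next((p for p in self.plugins if p.sniff(head)), None)

        def add(self, path):
            plugin = self.plugin_for(path)
            if plugin is None:
                return                    # no handler for this file; skip it
            owner = os.stat(path).st_uid  # recorded so queries can filter
            for term in plugin.extract_text(path).lower().split():
                self.index.setdefault(term, set()).add((path, owner))

        def query(self, term, uid):
            # Crude access check: only return items the caller owns; a real
            # implementation would evaluate full read permissions.
            return [p for p, o in self.index.get(term.lower(), ()) if o == uid]

A walker plug-in built on inotify would feed Indexer.add as files change; and a distributed query is just this query fanned out to peers.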

I’ve pointed out in a number of recent posts that you can effectively use Linux as a server platform in your business; however, it seems that if search is a requirement you might want to consider ponying up the money for Microsoft Windows Server 2008 and enjoy seamless search (that works) between your Windows Vista / Windows 7 desktops and Windows Server.

REFERENCES:

Ubuntu – Desktop Search

Originally posted 2010-07-16 02:00:19.