Entries Tagged as 'Technology'

Microsoft Office

Microsoft Word for MS-DOS shipped in September 1983.

In January 1985 Microsoft shipped Word 1.0 for Macintosh and Word 2.0 for DOS.  In September they followed with Excel 1.0 for Macintosh.

In September 1986 Microsoft shipped Microsoft Works for Macintosh, followed in October by Word 3.0 for Macintosh (skipping version 2.0) and Word 3.0 for DOS.

In July 1987 Microsoft acquires Forethought and with that the basis for PowerPoint.  In September PowerPoint 1.0 for the Macintosh is shipped.

In July 1988 Microsoft ships PowerPoint 2.0 for the Macintosh.

In June 1989 Microsoft ships Office 1.0 for the Macintosh.

In May 1990 Microsoft ships PowerPoint 2.0 for Windows and in October Office 1.0 (which includes Excel 2.0, Word 2.1, and PowerPoint 2.0).

In January 1991 Microsoft ships Excel 3.0 for Windows.  In October Word 2.0 for Windows.

In August 1992 Microsoft ships Office 3.0 for Windows (includes PowerPoint 3.0, Word for Windows 2.0, and Excel 4.0).  In November Microsoft ships Access 1.0.

In September 1993 Microsoft ships the one millionth copy of Access, and Access 1.1 is the number one selling PC database.  In November Office 4.0 for Windows ships, and by the end of the year more than ten million copies of Word are in use.

In May 1994 Microsoft ships Access 2.0 for Windows and Office 4.3 Professional for Windows (adding Access 2.0 to the Office 4 package).

In August 1995 Microsoft ships Office 95, supporting its new flagship operating environment Windows 95.  By the end of the year more than 30 million people use Excel.

In April 1996 Exchange Server 4.0 is released as an upgrade to Microsoft Mail 3.5.

In January 1997 Microsoft Outlook 97 ships.  In March Exchange Server 5.0.  In November Office 97 is introduced and sells more than 60 million copies.

In January 1998 Office 98 for the Macintosh ships (Word 98, Excel 98, PowerPoint 98, and Outlook Express).  In March Outlook 98 is introduced on Windows, and over 1 million copies are sold by May.

In March 1999 Access 2000 is released, which enables integration with Microsoft SQL Server.  In June Office 2000 ships and attempts to bring web integration to the Office platform.

In October 2000 Exchange Server 2000 is shipped, integrating e-mail, voice mail, and fax.

In March 2001 Office SharePoint Portal Server 2001 is shipped.  In May Office XP ships to support Microsoft's new flagship operating system.

In October 2003 Office 2003 ships along with Office SharePoint Portal Server 2003.  OneNote and InfoPath are introduced as parts of the Office system.  SharePoint is offered as a free addition to Windows Server 2003.  The Office logo is updated from the old puzzle image to its current form.  Exchange Server 2003 is shipped.

In April 2005 Microsoft acquires Groove and adds it to the Office suite.

In December 2006 Exchange Server 2007 is shipped.

In January 2007 Microsoft ships Office 2007 and SharePoint Server 2007.

In March 2008 Office Live debuts, by September 1 million users are signed up.  In October Office Web applications are announced.

In April 2009 Exchange 2010 is shipped.  In July Office 2010, Project 2010, and SharePoint 2010 are previewed.  In September Office Web Apps are previewed.  In October Microsoft introduces Office Starter 2010.  In November Office 2010, SharePoint 2010, Visio 2010, and Project 2010 become available as a public beta.  Office Mobile 2010 is announced and available as a public beta.


Microsoft certainly deserves a great deal of credit for pushing the envelope for office productivity applications.  Gone are the days of arcane key sequences in WordPerfect and hardware incompatibilities in VisiCalc…

Many companies choose to use Microsoft products because that is what they know, and that is what Microsoft’s huge sales force promotes… is Office 2010 in your future, or will you choose a different course?

Microsoft Office Timeline

Originally posted 2010-01-19 02:00:07.

Vista Sidebar Gadgets

There’s a ton of sidebar gadgets for Vista (just open up the gallery with “add gadgets” and select the “Get more gadgets online” link at the bottom right hand corner to have a look at the ones on Microsoft’s gadget site)…

Needless to say, most of the gadgets are CRAP; in fact, most of the gadgets that ship with Vista are lame — and to illustrate that, I don’t use ANY of the gadgets that shipped with Vista.

The clock in Vista takes too much room and only shows the time.  A better solution is the 12HourTime gadget; it shows the time, day of the week, and date in about 2/3 the space.

The CPU meter that ships with Vista is also lame; the mCPU meter seems to do a good job (especially for multi-core CPUs).

And because we’re all too lazy to get up and look out the window… the Weather Channel gadget does a great job at telling you the current conditions (at a reporting station near you).

There are several other gadgets that you might enable from time to time when you’re doing something… the uTorrent monitor, MSNGasPrice, AstronomyCenter, etc all could be useful — and of course that also depends on your interests.

One of the nice things about gadgets is that you can remove them fairly easily, and unlike lots of software they don’t pollute your system.

Originally posted 2008-05-15 11:34:01.

Ubuntu – Desktop Search

Microsoft has really shown the power of desktop search in Vista and Windows 7; their newest Desktop Search Engine works, and works well… so in my quest to migrate over to Linux I wanted to have the ability to have both a server style as well as a desktop style search.

So the quest began… and it was as short a quest as marching to the top of a butte.

I started by reviewing what I could find on the major contenders (just do an Internet search, and you’ll only find about half a dozen reasonable articles comparing the various desktop search solutions for Linux)… which were few enough it didn’t take very long (alphabetical):

  • Beagle
  • Google Desktop Search
  • Recoll
  • Strigi
  • Tracker

My metrics to evaluate a desktop search solution would focus on the following points:

  • ease of installation, configuration, maintenance
  • search speed
  • search accuracy
  • ease of access to search (applet, web, participation in Windows search)
  • resource utilization (cpu and memory on indexing and searching)

I immediately passed on Google Desktop Search; I have no desire for Google to have more access to information about me; and I’ve tried it before in virtual machines and didn’t think very much of it.


I first tried Beagle; it sounded like the most promising of all the search engines, and Novell was one of the developers behind it, so I figured it would be a stable baseline.

It was easy to install and configure (the package manager did most of the work), and I could use either the search application or the web search; to enable the web interface I used beagle-config:

beagle-config Networking WebInterface true

And then I could just go to port 4000 (either locally or remotely).

I immediately did a test search; nothing came back.  Wow, how disappointing — several hundred documents in my home folder should have matched.  I waited and tried again — still nothing.

While I liked what I saw, a search engine that couldn’t return reasonable results to a simple query (at all) was just not going to work for me… and since Beagle isn’t actively developed any longer, I’m not going to hold out for them to fix a “minor” issue like this.


My next choice to experiment with was Tracker; you couldn’t ask for an easier desktop search to experiment with on Ubuntu — it seems to be the “default”.

One thing that’s important to mention — you’ll have to enable the indexer (per-user); it’s disabled by default.  Just use the configuration tool (you might need to install an additional package).


Same test, but instantly I got about a dozen documents returned, and additional documents started to appear every few seconds.  I could live with this; after all I figured it would take a little while to totally index my home directory (I had rsync’d a copy of all my documents, emails, pictures, etc from my Windows 2008 server to test with, so there was a great deal of information for the indexer to handle).

The big problem with Tracker was there was no web interface that I could find (yes, I’m sure I could write my own web interface; but then again, I could just write my own search engine).


On to Strigi — straightforward to install, and easy to use… but it didn’t seem to give me the results I’d gotten quickly with Tracker (though better than Beagle), and it seemed to be limited to only ten results (WTF?).

I honestly didn’t even look for a web interface for Strigi — it was way too much of a disappointment (in fact, I think I’d rather have put more time into Beagle to figure out why I wasn’t getting search results than to work with Strigi).


My last test was with Recoll; while it looked promising from all that I read, everyone seemed to indicate it was difficult to install and that you needed to build it from source.

Well, there’s an Ubuntu package for Recoll — so it’s just as easy to install; it just turned out to be a waste of effort.

I launched the recoll application, and typed a query in — no results came back, but numerous errors were printed in my terminal window.  I checked the preferences, and made a couple minor changes — ran the search query again — got a segmentation fault, and called it a done deal.

It looked to me from the size of the database files that Recoll had indexed quite a bit of my folder; why it wouldn’t give me any search results (and seg faulted) was beyond me — but it certainly was something I’d seen before with Linux based desktop search.


My biggest conclusion was that Desktop Search on Linux just isn’t really something that’s ready for prime time.  It’s a joke — a horrible joke.

Of the search engines I tried, only Tracker worked reasonably well, and it has no web interface, nor does it participate in a Windows search query (SMB2 feature which directs the server to perform the search when querying against a remote file share).

I’ve been vocal in the past that Linux fails as a Desktop because of the lack of a cohesive experience; but it appears that Desktop Search (or search in general) is a failing of Linux as both a Desktop and a Server — and clearly a reason why Windows Server 2008 is the only reasonable choice for businesses.

The only upside to this evaluation was that it took less time to do than to read about or write up!

Originally posted 2010-07-06 02:00:58.

The new SPAM medium…

It looks like Facebook and Twitter and the like are the new medium of choice for unethical companies sending SPAM…

This morning I received a message from SurfCanister via both Facebook and Twitter (I don’t have an account on either of those, and both were sent to the same [free] email address).

I don’t do business with companies that send SPAM of any sort — and it appears that neither Facebook nor Twitter has created sufficient safeguards to protect the public from companies with low ethics.

Here’s a good policy for both of them:

1) A single complaint of SPAM: suspend the offender’s account for 30 days.

2) Two or more complaints of SPAM: permanently close the offender’s account.

That should put a quick end to using social media for SPAM… though it seems to me that the social media companies are not very ethical themselves, and they seem to want to encourage this type of messaging.

Someone might want to point out that California has an anti-SPAM law, and both Facebook and Twitter are headquartered in California.

Originally posted 2012-06-08 09:00:56.

Dynamic Sitemap

About two years ago I wrote a program that created a sitemap from a local copy of my web pages (I also wrote an automation wrapper so that I could do all my web sites along with other mundane tasks reliably).

When I installed WordPress over a year ago I really liked the fact that the sitemap plug-in was capable of dynamically creating a sitemap when a request was made, and I set it as a goal to implement that on my web site.

Well, yesterday that goal was realized.

I wrote a simple PHP script that takes some meta information and creates a sitemap, either uncompressed or compressed based on what is requested.  I used a rewrite rule in my .htaccess file to allow search engines to continue to request the familiar sitemap.xml and/or sitemap.xml.gz file.
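As a sketch, the rewrite rules in question might look something like the following — note that the script name `sitemap.php` and the `gz` query parameter are assumptions for illustration, not the actual code:

```apache
# Hypothetical .htaccess rules: hand the familiar sitemap URLs
# to a PHP script that generates the XML on demand
RewriteEngine On
RewriteRule ^sitemap\.xml$     sitemap.php      [L]
RewriteRule ^sitemap\.xml\.gz$ sitemap.php?gz=1 [L]
```

With rules like these, search engines keep requesting the same URLs they always have, and the server quietly generates the response on the fly.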

Now I don’t have to worry about creating and deploying a sitemap file when I change a file; I only have to make sure that the meta information is updated when I add or remove pages.  Plus, I incorporated the concept of dynamic pages, so that the sitemap can accurately report fresh content.
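The idea generalizes beyond PHP; here’s a minimal Python sketch of the same approach, where the page metadata list is a made-up stand-in for whatever meta information the real script reads:

```python
import gzip
from datetime import date

# Hypothetical page metadata; the real script would read this
# from wherever the site's meta information is maintained
PAGES = [
    {"loc": "http://www.example.com/", "lastmod": date(2010, 2, 22)},
    {"loc": "http://www.example.com/blog/", "lastmod": date(2010, 2, 20)},
]

def build_sitemap(pages):
    """Render a sitemap.xml document per the sitemaps.org protocol."""
    entries = "".join(
        "  <url>\n"
        f"    <loc>{p['loc']}</loc>\n"
        f"    <lastmod>{p['lastmod'].isoformat()}</lastmod>\n"
        "  </url>\n"
        for p in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}</urlset>\n"
    )

xml = build_sitemap(PAGES)
compressed = gzip.compress(xml.encode("utf-8"))  # the sitemap.xml.gz variant
```

Serving either the plain or the gzipped form based on the request is then a one-line decision.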

At the moment I haven’t decided if I’m going to “publish” this code or not.  It’s likely I will once I clean it up and test it more completely.  Like I said, it isn’t rocket science – it just takes a little knowledge of what a sitemap is (you can get everything you need from sitemaps.org), a little ability in PHP, and a basic understanding of how to write a rewrite rule for Apache.

Originally posted 2010-02-23 01:00:50.

Virtualization Solutions

On Windows there are basically three commercial solutions for virtualization, and several free solutions… wait, one of the commercial solutions is free (well, when you buy the operating system), and the other is partially free…

  • Microsoft Virtual PC (runs on both servers and workstations)
  • Microsoft Virtual Server (runs on both servers and workstations)
  • Microsoft Hyper-V (runs only on Windows Server 2008)
  • Parallels Workstation (runs on workstations)
  • Parallels Server (runs on both servers and workstations)
  • VMware Player (runs on both servers and workstations)
  • VMware Workstation (runs on both servers and workstations)
  • VMware Server (runs on both servers and workstations)
  • Citrix (aka XenSource)

For Intel-based Macs you have these commercial solutions:

  • Parallels Desktop
  • Parallels Server
  • VMware Fusion

And for Linux you have the following commercial solutions, and many free solutions (Xen being one of the leaders):

  • Parallels Desktop
  • Parallels Server
  • VMware Player
  • VMware Workstation
  • VMware Server
  • Citrix (aka XenSource)

And for bare metal you have:

  • Parallels Server
  • VMware


I’m not going to go into details on any of these; I just wanted to give at least a partial list with a few thoughts.

If you’re new to virtualization, use one of the free virtualization solutions.  You can try several of them, and many of them can convert a virtual machine from another vendor’s format to their own; but learn what the strengths and weaknesses of each are before you spend money on a solution that might not be the best for you.

Microsoft Virtual Server has some definite performance advantages over Microsoft Virtual PC… there are some things you might lose with Virtual Server that you might want (the local interface); but Virtual Server installs on both server and workstation platforms, so try it.

For Mac I definitely like Parallels Desktop better than VMware Fusion; but you may not share my opinion.  VMware claims to be faster, though I certainly don’t see it.  And I might add, that if you have a decent machine you’re running virtualization software on, fast isn’t going to be the number one concern — correctness is far more important.

Also, with each of the virtualization systems, hosts, and guests there are best practices for optimizing installation and performance.  I’ll try to write up some of the information I’ve put together that keeps my virtual machines running well.

For the record, I run Microsoft Virtual Server 2005 R2 (64 bit) on Windows Server 2003 R2 x64 SP2, and on Windows Vista Ultimate and Business x64 SP1; works well.  And I run Parallels Desktop v3 on my Macs.

For the most part my guests are Windows XP Pro (x86) and Windows Server 2003 (x86); I don’t really need 64-bit guests (at the moment), but I do also run Ubuntu, Debian, Red Hat, Freespire, and other Linux distributions…

Like I said, figure out your requirements, play with several of the virtualization systems and spend your money on more memory, perhaps a better processor, and stick with the free virtualization software!

Originally posted 2008-05-18 20:25:18.

Blogging Software

I looked at quite a few blogging solutions before I settled on WordPress, and I wanted to share some of my thoughts.

There are a number of free blogging services; indeed WordPress operates one.  Generally when you use a free service you get advertising on your pages (I really didn’t care for that, just like I don’t care for advertising on my web page).  But the one thing that is always the problem with free services is, you get what you get… and you’re stuck with it.

I tend to like to have full control over my information, so I decided to install the software to run my blog so that I could change ANYTHING I wanted to about it.  And WordPress is open source, written in PHP, so it’s relatively straightforward to make any changes you want.  And, of course, the price is right — FREE.

There are tons of add-ons (widgets, themes, and plugins), though not all of them are free.  There are commercial add-ons available for WordPress as well.

I looked at several other solutions, originally ASP-based solutions (when I was running Microsoft IIS); but when I switched to a hosting company, something that ran on PHP, Perl, or Python became a more practical solution (my hosting company provides more flexible services with a Linux hosting plan than with a Microsoft hosting plan).

Here’s a list of blogging software written in PHP (in no particular order) you might want to take a look at:

And here’s a list of blogging software written for ASP (in no particular order) you might want to take a look at:


You’ll want to evaluate blogging software with your requirements in mind; but one thing to keep in mind is most blogging software supports importing from another system via RSS.  That might not be a requirement for the first blogging system you choose (but I think it shows a level of maturity in the software), it certainly will be for any blogging software after that.
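The core of an RSS-based import is just parsing the feed and pulling out the posts; here’s a minimal Python sketch (the two-item feed is made up for illustration):

```python
import xml.etree.ElementTree as ET

# A made-up two-item feed standing in for another blog's RSS export
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><description>Hello</description></item>
    <item><title>Second post</title><description>World</description></item>
  </channel>
</rss>"""

def extract_posts(feed_xml):
    """Pull (title, body) pairs out of an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [
        (item.findtext("title"), item.findtext("description"))
        for item in root.iter("item")
    ]

posts = extract_posts(FEED)
```

A real importer would then write each pair into the new system’s own database, mapping categories, comments, and the rest as best it can.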

One thing to keep in mind: the information in a blog site is written into a database; and while it wouldn’t be my first choice, you can dump the database to get the text of the articles, but it’s unlikely you’re going to be able to directly import it into another system’s database (without a lot of work).

Originally posted 2008-05-15 11:24:35.

Nothing but the necessities…

In a school district that is struggling to keep teachers, it’s amazing that the Santa Rosa County Florida School District can find the money (and need) for 90 iPad2s for administrators (it’s also amazing that there are 90 administrators in a county with only about 150,000 residents).

I’m glad to see that my tax dollars are well spent on essential items to ensure that today’s school children will be properly educated, and that the administrators responsible for overseeing that education will have new toys at their disposal to sit mostly unused in their desks — after all, an edict has been issued by the school district that these devices are only to be used in a professional capacity.

I wonder, will it be grounds for immediate termination the first time a game is played on, a Facebook post is made from, or personal email is sent via one of these essential educational tools — inquiring minds want to know.

My personal feeling is this money would be better spent offsetting the $4.4 million shortfall for the 2011-2012 school year that is necessitating the layoff of teaching staff — of course, why should I be surprised about iPad2s for administrators; after all, most of them just got raises to address the inequities in their pay (I guess they couldn’t afford their own iPad2s — though they seem to expect teachers to buy a great deal of supplies for their classrooms out of their considerably smaller salaries).

Originally posted 2011-08-15 02:00:22.


No, not a swarm of bees, wasps, hornets, or yellow jackets — I’m talking about file sharing technology.

First, there’s absolutely nothing illegal or immoral about using file sharing technology for file sharing and distribution, just as there’s nothing illegal or immoral about using hyper-text (http) or file transfer (ftp) technology.  It all has to do with the content you’re trying to exchange, not the system you’re using to exchange it.

There are many legitimate uses for BitTorrent and other P2P technologies.  Here’s a perfect example.

A small company has a number of offices spread throughout the world, and no one location has an internet connection with significant bandwidth (let’s say, for argument, they all have a high-end class of DSL service, but none of them has fibre).  This company would like to distribute its trial software, but because of the economics can’t afford to pay for additional bandwidth or for a content delivery system — so it could opt to “aggregate” the bandwidth of all of its offices by providing a torrent and running torrent servers at each location.  That would allow the nodes with the most bandwidth available to satisfy requests, and any individuals who had downloaded the software and elected to continue seeding would be able to source it as well.  While no one individual might get the software as quickly (though that’s not necessarily true), many more people would be able to get the software sooner, and at no additional cost; thus the company could meet its budgetary constraints and might not have to consider increasing the amount it needs to charge for the software to cover operating expenses.

Swarming technology is real, it’s practical, and it’s a solution for a number of problems.

Swarms are highly fault tolerant, they’re highly distributed, and they dynamically adjust to changing conditions…

While any technology can be abused and misused, there’s nothing inherently bad in any of the P2P technologies.  Just because bank robbers use pens to write hold up notes we didn’t outlaw the pen or pencil…

Originally posted 2009-01-12 12:00:19.

Computer Tid Bits; Malware

Computer viruses, worms, trojans, etc. are on the rise… if your computer is connected directly to the internet (or is on a public wireless network) you’ll definitely want to have a firewall enabled.  The firewall in Windows XP SP2 (or better) and Vista is reasonably good (so there’s no reason to spend money on one).

Also, you should definitely consider running Windows Defender (free from Microsoft) and a Virus scanner.

Two good free Virus scanners are Avast and AVG.

Avast is extremely thorough, but can put a bit of a load on lower end systems.  AVG isn’t as thorough, but a great deal lighter on CPU.  Also, Avast will require you to register for a key — you can use a throw-away email address (from my experience they don’t seem to SPAM).



Originally posted 2008-05-09 18:20:12.