Entries Tagged as 'Windows Server'

Virtual Server 2005 R2 with Internet Explorer 8

You’ve probably read my rant on IE8 and how broken it is.

If you have IE8, and you need to use Virtual Server 2005 R2 (and perhaps previous versions as well), and you’re tired of having to select compatibility mode manually all the time…

You can add a custom header to your web site to force IE8 into IE7 (compatibility) mode.

However, on a workstation (XP, Vista, etc) that means all of your web sites will force IE8 into IE7 mode; on a server (Server 2003, Server 2008, etc) you can set the header on only the virtual server web site.

Why Microsoft doesn’t issue a hot fix for this is totally beyond me… it seems like it would be trivial for them to make the web service app send the META tag; or they could actually address the compatibility issues.

On Vista you’ll find the menu you need via:

  • Computer->Manage->Services and Applications->Internet Information Server->HTTP Response Headers->Add

And the Custom HTTP Response Header name and value you’ll set are:

  • Name:  X-UA-Compatible
  • Value: IE=EmulateIE7
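On IIS 7 and later you can also set the same header in the site’s web.config.  This fragment is just a sketch: the section names are standard IIS configuration, but you’ll need to merge it with whatever already exists in your site’s web.config:

```xml
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- Force IE8 to render this site in IE7 compatibility mode -->
        <add name="X-UA-Compatible" value="IE=EmulateIE7" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>
```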

On other versions of Windows you just need to get to the IIS management console and figure out how to set the custom HTTP header on a site (remember, workstation versions of Windows only have one web site, so depending on the version of Windows you’ll see either ‘default’ or nothing listed).

Originally posted 2009-08-27 01:00:02.

Hyper-V Transition

I’ve started to make the conversion of my servers and virtualization hosts to Windows Server 2008, and with Windows Server 2008 comes Hyper-V.

While I believe there has to be a rather substantial update to Hyper-V, most of  the initial feedback is promising, and the performance is good.

Since I already have an investment in virtualization using Virtual Server 2005 R2 (x64) Hyper-V is the logical choice (though you could consider others — Virtual Box would be a free alternative).

Microsoft makes it possible for you to import Virtual PC or Virtual Server machines only if you purchase the Virtual Machine Manager; for those of us who don’t run a virtual data center that might be a bit much (I think it was available through my MSDN subscription, but I really didn’t want to install it).  For those who don’t mind a few manual steps you can use Matthijs ten Seldam’s Virtual Server to Hyper-V import tool – VMC2HV (I’ve included links at the bottom of the post to his BLOG as well as direct to his SkyDrive for download).

With VMC2HV you will have to manually remove Virtual Machine Additions from your old virtual machines, and manually start the process of HAL upgrade and Virtual Integration Services install on your Hyper-V machine, but it’s very straightforward and Hyper-V will actually give you info-tips to guide you along.

The tool is fairly straightforward to use; the only thing you need to remember is to have it swap SCSI0 with IDE0 (if you used SCSI drives on Virtual Server, remember that Hyper-V can only boot from an IDE drive; hopefully that will change soon).

Originally posted 2009-01-30 01:00:22.

System Update Readiness Tool for Windows

If you have any issues installing Windows 6 SP2 or an update for Vista or Server 2008, you might want to download and run the System Update Readiness Tool from Microsoft.

You can read about it and download it via the link below.

 

http://support.microsoft.com/kb/947821

Originally posted 2009-06-08 11:00:04.

bootrec.exe

Bootrec.exe, available from the command line as part of the repair environment, can resolve a number of startup issues on Windows.  It comes in quite handy for replacing the master boot record (MBR) and boot loader (a good way to remove a multi-boot manager like GRUB).

Be sure you understand what you’re doing if you choose to use it.

 Use the Bootrec.exe tool in the Windows Recovery Environment to troubleshoot and repair startup issues in Windows
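As a quick reference, these are the four operations Bootrec supports (run them from a Windows Recovery Environment command prompt, and again, know what each one does before you run it):

```
rem Write a fresh MBR (removes a third-party boot manager like GRUB):
bootrec /FixMbr

rem Write a new boot sector to the system partition:
bootrec /FixBoot

rem Scan all disks for Windows installations:
bootrec /ScanOs

rem Scan for installations and rebuild the Boot Configuration Data (BCD) store:
bootrec /RebuildBcd
```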

Originally posted 2013-11-13 17:00:09.

Microsoft Updates

I’ve got a new pet-peeve (like I had a shortage of them before)…

nVidia has been coming out with display updates for their video cards for Vista about once per month (OK — a little less often than that); and Microsoft has been dutifully pushing down certified drivers to users.

First, the big problem I have with the nVidia driver for my 9800s is that I periodically have the machine freeze and get a message that the display driver stopped responding (but has recovered)… maybe nVidia should be concentrating on fixing that issue and hold off on updates until there’s really some substantial progress [but that might negatively impact them re-naming old technology and trying to sell it as something new].

OK — I digressed… but like I said, it’s a new pet-peeve, and I want to revel in it.

The really annoying thing is that every time Microsoft downloads and installs a new video driver the system resizes all my open windows and rearranges the icons (shortcuts) on my desktop…

Now perhaps this is only because I have a multiple display system… but regardless you’d think the children in Redmond might have considered storing the previous state of windows BEFORE activating the new video driver and restoring it afterwards — after all, they are concerned with user experience, RIGHT?

RIGHT… I think the phrase would be “experience THIS!”

Microsoft has come a long way in the last few years in making computers easier to use, and easier to maintain… but they (Microsoft) still fail to actually have people who use computers design features for them… and that’s why using Windows has always felt like it was held together by chewing gum and string — BECAUSE IT IS.

I could do with one less version of Internet Explorer and a bit more work on polishing the overall user experience… and why all these “major” upgrades???  Why not just a continuous stream of improvements to each and every part of the system???

Originally posted 2009-08-22 01:00:10.

Windows 6 Service Pack 2

It’s out… it’s been in BETA for quite some time.

Just so you’re clear: Windows 6 covers all of the Vista family and the Server 2008 family; there’s an installer for 32-bit and one for 64-bit, and there’s also a DVD image you can use to install either.

You can find a number of articles on the web telling you all about what was originally supposed to be in SP2, and what ended up in it… other than Bluetooth 2.1 and Blu-Ray support there isn’t that much that caught my eye as for “features”.

The big thing you will notice is that this makes Vista noticeably faster… and includes the compcln.exe tool that allows you to remove previous component versions (saving disk space — of course once you do so, you cannot go back to previous versions… but if your machine is stable after SP2 you probably wouldn’t want to).

You must have SP1 installed first (Server 2008 comes with SP1 “pre-installed”).

You can access the Microsoft TechNet article via the link below and download the file(s) you desire.  At the moment SP2 is not included in automatic updates, but it will likely be pushed out soon.

http://technet.microsoft.com/en-us/windows/dd262148.aspx

Originally posted 2009-06-07 11:00:22.

Online Capacity Expansion

Well…

  • Call me old fashioned…
  • Call me conservative…
  • Call me a doubting “Thomas”…
  • Call me tickled pink…
  • Call me surprised…

I just finished adding four additional spindles to one of my virtual hosts; when I originally built it out I only had four spindles available, and didn’t want to buy more since I knew I would be freeing up smaller spindles for it soon.

The first task was to have the RAID software add the new spindles to the array, then to “expand” the array container… the first step took only a few moments, the second step took about 20 hours for the array controller to rebuild / expand the array.

The second task was to get Windows to actually use the added space by expanding the volume; doing that with diskpart.exe (you can search Microsoft’s Knowledge Base for details) only took a few moments.
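For reference, the whole diskpart operation is only a handful of commands; as a sketch, save something like this to a file and run it with diskpart /s (the volume number here is just an example; run list volume first to find yours):

```
rem grow the volume sitting on the expanded array
select volume 2
extend
```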

The incredible thing about this was that my virtual host and virtual machines were online for the entire 20 hours — with absolutely no service interruption.

This particular machine used a Dell / LSI controller; but the Promise controllers also support dynamic capacity expansion, as do 3Ware controllers.  I believe the Intel Matrix pseudo-RAID controller also supports dynamic capacity expansion; but as with other RAID and pseudo-RAID controllers you should check the documentation specific to it and consult the manufacturer’s web site for errata and updates before proceeding.

The bottom line is Windows and RAID arrays have come a long way, and it’s quite possible that you will be able to expand the capacity of your array without taking your server down; however, if the data on the server is irreplaceable, I recommend you consider backing it up (at least the irreplaceable data).

Originally posted 2008-12-01 12:00:56.

Ubuntu – Desktop Search

Microsoft has really shown the power of desktop search in Vista and Windows 7; their newest Desktop Search Engine works, and works well… so in my quest to migrate over to Linux I wanted to have the ability to have both a server style as well as a desktop style search.

So the quest begun… and it was as short a quest as marching on the top of a butte.

I started by reviewing what I could find on the major contenders (just do an Internet search, and you’ll only find about half a dozen reasonable articles comparing the various desktop search solutions for Linux)… which were few enough it didn’t take very long (alphabetical):

My metrics for evaluating a desktop search solution focused on the following points:

  • ease of installation, configuration, maintenance
  • search speed
  • search accuracy
  • ease of access to search (applet, web, participation in Windows search)
  • resource utilization (cpu and memory on indexing and searching)

I immediately passed on Google Desktop Search; I have no desire for Google to have more access to information about me; and I’ve tried it before in virtual machines and didn’t think very much of it.

Beagle

I first tried Beagle; it sounded like the most promising of all the search engines, and Novell was one of the developers behind it, so I figured it would be a stable baseline.

It was easy to install and configure (the package manager did most of the work); and I could use either the search application or the web search, though I had to enable the latter using beagle-config:

beagle-config Networking WebInterface true

And then I could just go to port 4000 (either locally or remotely).

I immediately did a test search; nothing came back.  Wow, how disappointing — several hundred documents in my home folder should have matched.  I waited and tried again — still nothing.

While I liked what I saw, a search engine that couldn’t return reasonable results to a simple query (at all) was just not going to work for me… and since Beagle isn’t actively developed any longer, I’m not going to hold out for them to fix a “minor” issue like this.

Tracker

My next choice to experiment with was Tracker; you couldn’t ask for an easier desktop search to experiment with on Ubuntu — it seems to be the “default”.

One thing that’s important to mention — you’ll have to enable the indexer (per-user), it’s disabled by default.  Just use the configuration tool (you might need to install an additional package):

tracker-preferences

Same test, but instantly I got about a dozen documents returned, and additional documents started to appear every few seconds.  I could live with this; after all I figured it would take a little while to totally index my home directory (I had rsync’d a copy of all my documents, emails, pictures, etc from my Windows 2008 server to test with, so there was a great deal of information for the indexer to handle).

The big problem with Tracker was there was no web interface that I could find (yes, I’m sure I could write my own web interface; but then again, I could just write my own search engine).

Strigi

On to Strigi — straightforward to install, and easy to use… but it didn’t seem to give me the results I’d gotten quickly with Tracker (though better than Beagle), and it seemed to be limited to only ten results (WTF?).

I honestly didn’t even look for a web interface for Strigi — it was way too much of a disappointment (in fact, I think I’d rather have put more time into Beagle to figure out why I wasn’t getting search results than work with Strigi).

Recoll

My last test was with Recoll; it looked promising from all that I read, but everyone seemed to indicate it was difficult to install and that you needed to build it from source.

Well, there’s an Ubuntu package for Recoll — so it’s just as easy to install; it just was a waste of effort to install.

I launched the recoll application, and typed a query in — no results came back, but numerous errors were printed in my terminal window.  I checked the preferences, and made a couple minor changes — ran the search query again — got a segmentation fault, and called it a done deal.

It looked to me from the size of the database files that Recoll had indexed quite a bit of my folder; why it wouldn’t give me any search results (and seg faulted) was beyond me — but it certainly was something I’d seen before with Linux based desktop search.

Conclusions

My biggest conclusion was that Desktop Search on Linux just isn’t really something that’s ready for prime time.  It’s a joke — a horrible joke.

Of the search engines I tried, only Tracker worked reasonably well, and it has no web interface, nor does it participate in a Windows search query (SMB2 feature which directs the server to perform the search when querying against a remote file share).

I’ve been vocal in my past that Linux fails as a Desktop because of the lack of a cohesive experience; but it appears that Desktop Search (or search in general) is a failing of Linux as both a Desktop and a Server — and clearly a reason why choosing Windows Server 2008 is the only reasonable choice for businesses.

The only upside to this evaluation was that it took less time to do than to read about or write up!

Originally posted 2010-07-06 02:00:58.

Virtualization Solutions

On Windows there are basically three commercial solutions for virtualization, and several free solutions… wait, one of the commercial solutions is free (well, when you buy the operating system), and the other is partially free…

  • Microsoft Virtual PC (runs on both servers and workstations)
  • Microsoft Virtual Server (runs on both servers and workstations)
  • Microsoft Hyper-V (runs only on Windows Server 2008)
  • Parallels Workstation (runs on workstations)
  • Parallels Server (runs on both servers and workstations)
  • VMware Player (runs on both servers and workstations)
  • VMware Workstation (runs on both servers and workstations)
  • VMware Server (runs on both servers and workstations)
  • Citrix (aka XenSource)

For Intel-based Macs you have these commercial solutions:

  • Parallels Desktop
  • Parallels Server
  • VMware Fusion

And for Linux you have the following commercial solutions, and many free solutions (Xen being one of the leaders):

  • Parallels Desktop
  • Parallels Server
  • VMware Player
  • VMware Workstation
  • VMware Server
  • Citrix (aka XenSource)

And for bare metal you have

  • Parallels Server
  • VMware

 

I’m not going to go into details on any of these, I just wanted to give at least a partial list with a few thoughts.

If you’re new to virtualization, use one of the free virtualization solutions.  You can try several of them, and many of them can convert a virtual machine from another vendor’s format to its own; but learn what the strengths and weaknesses are of each before you spend money on a solution that might not be the best for you.

Microsoft Virtual Server has some definite performance advantages over Microsoft Virtual PC… there are some things you might lose with Virtual Server that you might want (the local interface); but Virtual Server installs on both server and workstation platforms, so try it.

For Mac I definitely like Parallels Desktop better than VMware Fusion; but you may not share my opinion.  VMware claims to be faster, though I certainly don’t see it.  And I might add, that if you have a decent machine you’re running virtualization software on, fast isn’t going to be the number one concern — correctness is far more important.

Also, with each of the virtualization systems, hosts, and guests there are best practices for optimizing the installation and performance.  I’ll try and write up some of the information I’ve put together that keeps my virtual machines running well.

For the record, I run Microsoft Virtual Server 2005 R2 (64 bit) on Windows Server 2003 R2 x64 SP2, and on Windows Vista Ultimate and Business x64 SP1; works well.  And I run Parallels Desktop v3 on my Macs.

For the most part my guests are Windows XP Pro (x86) and Windows Server 2003 (x86); I don’t really need 64-bit guests (at the moment), but I do also run Ubuntu, Debian, Red Hat, Freespire, and other Linux distributions…

Like I said, figure out your requirements, play with several of the virtualization systems and spend your money on more memory, perhaps a better processor, and stick with the free virtualization software!

Originally posted 2008-05-18 20:25:18.

File System Fragmentation

All file systems suffer from fragmentation.

Let me rephrase that more clearly in case you didn’t quite get it the first time.

ALL FILE SYSTEMS SUFFER FROM FRAGMENTATION <PERIOD>.

It doesn’t matter what file system you use on your computer; if you delete and write files it will become fragmented over time.  Some older file systems (like, say, FAT and FAT32) had major performance issues as the file system began to fragment; more modern file systems do not suffer as much performance loss from fragmentation, but they still suffer.

If you want to argue that your writable file system doesn’t fragment, you haven’t a clue what you’re talking about, so read up on how your file system really works and how block devices work to understand why you just can’t have a file system that doesn’t fragment files or free space or both.
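If you want to see why, a toy first-fit allocator demonstrates the effect in a few lines of Python.  This is purely illustrative (real allocators are far smarter), but the principle holds: deletes punch holes in free space, and later files get scattered into them.

```python
# Toy model of a block device: each slot is one disk block,
# holding the name of the file that owns it (or None if free).
DISK_BLOCKS = 12

def allocate(disk, name, nblocks):
    """First-fit allocator: take the first free blocks found,
    whether or not they happen to be contiguous."""
    placed = []
    for i in range(len(disk)):
        if disk[i] is None:
            disk[i] = name
            placed.append(i)
            if len(placed) == nblocks:
                return placed
    raise RuntimeError("disk full")

def delete(disk, name):
    """Free every block owned by a file, leaving holes behind."""
    for i in range(len(disk)):
        if disk[i] == name:
            disk[i] = None

disk = [None] * DISK_BLOCKS
allocate(disk, "A", 4)             # file A gets blocks 0-3
allocate(disk, "B", 4)             # file B gets blocks 4-7
delete(disk, "A")                  # deleting A leaves a 4-block hole
blocks_c = allocate(disk, "C", 6)  # C fills the hole AND spills past B
print(blocks_c)                    # -> [0, 1, 2, 3, 8, 9]: C is fragmented
```

No matter how clever the allocator is about choosing blocks, once the free space is broken into holes smaller than the file being written, something has to be split.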

What can you do about fragmentation?

Well, you might not really need to do anything; modern disk drives are fast, and on a computer that’s doing many things at once the fragmentation may not have much of any impact on your performance, but after a while you’re probably going to want to defragment your files.

The act of copying a file will generally defragment it; most modern file systems will attempt to allocate contiguous space for a file if they can (files that grow over time cannot be allocated contiguously, but they can be defragmented at their current size).

On many operating systems you can actually get programs that are designed to defragment your file system.

How often should you defragment your file system?

Well, I generally recommend you do it right after installing and updating your computer; and then any time you make major changes (large software installation, large update, etc).  But don’t do it automatically or on a routine schedule; there’s not enough benefit to that.

You can also analyze your disk (again using software) to determine how fragmented it is… and then defragment when it reaches some point that you believe represents a performance decrease.
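On Vista and later the built-in defrag tool can do that analysis from the command line.  The exact switches vary a bit between Windows versions (check defrag /? on yours), but as a sketch, from an elevated prompt:

```
rem report fragmentation without changing anything
defrag C: /A

rem actually defragment, printing progress as it goes
defrag C: /U
```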

Also, try and keep your disk clean, delete your browser cache, temporary files, duplicate files, and clutter — the less “junk” you have on your disk, the less need there will be for defragmenting.

Originally posted 2009-01-05 12:00:03.