Entries Tagged as 'Windows Server'

File System Fragmentation

All file systems suffer from fragmentation.

Let me rephrase that more clearly in case you didn’t quite get it the first time.

ALL FILE SYSTEMS SUFFER FROM FRAGMENTATION. PERIOD.

It doesn’t matter what file system you use on your computer; if you delete and write files, it will become fragmented over time.  Some older file systems (like, say, FAT and FAT32) had major performance issues as the file system began to fragment; more modern file systems don’t suffer as much performance loss from fragmentation, but they still suffer.

If you want to argue that your writable file system doesn’t fragment, you haven’t a clue what you’re talking about.  Read up on how your file system really works and how block devices work, and you’ll understand why you just can’t have a file system that doesn’t fragment files, free space, or both.
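To make that concrete, here’s a toy first-fit block allocator (a Python sketch; the allocator and the sizes are invented for illustration, not modeled on any real file system).  Write three files, delete the middle one, and a later, larger file has no choice but to split across the free-space holes:

```python
def allocate(free_extents, blocks_needed):
    """First-fit allocation: take space from free extents in order,
    splitting the file across holes when no single hole is big enough.
    Returns (extents used by the file, remaining free extents)."""
    used, new_free, remaining = [], [], blocks_needed
    for start, length in free_extents:
        if remaining == 0:
            new_free.append((start, length))
            continue
        take = min(length, remaining)
        used.append((start, take))
        remaining -= take
        if length > take:
            new_free.append((start + take, length - take))
    if remaining:
        raise RuntimeError("disk full")
    return used, new_free

free = [(0, 100)]                 # a 100-block "disk", all free
a, free = allocate(free, 20)      # file A: blocks 0-19
b, free = allocate(free, 20)      # file B: blocks 20-39
c, free = allocate(free, 20)      # file C: blocks 40-59

# Delete B: its blocks return to the free list, leaving a hole.
free = sorted(free + b)           # free space is now (20,20) and (60,40)

# File D needs 30 blocks; no single hole is big enough, so D fragments.
d, free = allocate(free, 30)
print(d)                          # [(20, 20), (60, 10)] -- two extents
```

No matter how clever the allocator, once deletes punch holes in the free space, some new file eventually lands in more than one extent.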

What can you do about fragmentation?

Well, you might not really need to do anything.  Modern disk drives are fast, and on a computer that’s doing many things at once, fragmentation may not have much of any impact on your performance; but after a while you’re probably going to want to defragment your files.

The act of copying a file will generally defragment it; most modern file systems will attempt to allocate contiguous space for a file if they can (files that grow over time cannot be allocated contiguously, but they can be defragmented at their current size).

On many operating systems you can actually get programs that are designed to defragment your file system.

How often should you defragment your file system?

Well, I generally recommend you do it right after installing and updating your computer, and then any time you make major changes (a large software installation, a large update, etc.).  But don’t do it automatically or on a routine schedule; there’s not enough benefit to that.

You can also analyze your disk (again using software) to determine how fragmented it is… and then defragment when it reaches some point that you believe represents a performance decrease.
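On Windows, the built-in defrag.exe will do that analysis from an elevated command prompt (the flag spelling varies slightly by Windows version; if memory serves, it’s a dash on Vista and a slash on Windows 7):

```
defrag C: -a      Vista / Server 2008: analyze only, report fragmentation
defrag C: /A /V   Windows 7 and later: analyze with a verbose report
```

The report tells you the fragmentation percentage, so you can decide for yourself where your threshold is.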

Also, try to keep your disk clean: delete your browser cache, temporary files, duplicate files, and clutter.  The less “junk” you have on your disk, the less need there will be for defragmenting.

Originally posted 2009-01-05 12:00:03.

Microsoft Updates

I’ve got a new pet peeve (like I had a shortage of them before)…

nVidia has been coming out with display driver updates for their video cards on Vista about once per month (OK, a little less often than that), and Microsoft has been dutifully pushing down certified drivers to users.

First, the big problem I have with the nVidia driver for my 9800s is that the machine periodically freezes and I get a message that the display driver stopped responding (but has recovered)… maybe nVidia should concentrate on fixing that issue and hold off on updates until there’s really some substantial progress [but that might negatively impact their re-naming old technology and trying to sell it as something new].

OK — I digressed… but like I said, it’s a new pet-peeve, and I want to revel in it.

The really annoying thing is that every time Microsoft downloads and installs a new video driver, the system resizes all my open windows and rearranges the icons (shortcuts) on my desktop…

Now perhaps this is only because I have a multiple-display system… but regardless, you’d think the children in Redmond might have considered storing the previous state of the windows BEFORE activating the new video driver and restoring it afterwards.  After all, they are concerned with user experience, RIGHT?

RIGHT… I think the phrase would be “experience THIS!”

Microsoft has come a long way in the last few years in making computers easier to use and easier to maintain… but they (Microsoft) still fail to actually have people who use computers design features for them… and that’s why using Windows has always felt like it was held together by chewing gum and string: BECAUSE IT IS.

I could do with one less version of Internet Explorer and a bit more work on polishing the overall user experience… and why all these “major” upgrades???  Why not just a continuous stream of improvements to each and every part of the system???

Originally posted 2009-08-22 01:00:10.

Microsoft Hyper-V Server 2008

Last week Microsoft released the FREE version of Microsoft Hyper-V Server 2008; this is a scaled-down Server 2008 with Hyper-V installed that allows you to run a light-weight virtualization host (much like many of the competitors in the virtualization world).

There are some limits on this version: a maximum of 4 processors [don’t confuse that with cores; I think Microsoft counts physical processors, not cores] and 32GB of memory.

You can get details on Hyper-V Server 2008 here:
http://www.microsoft.com/servers/hyper-v-server/default.mspx.

And you can download Hyper-V Server 2008 here:
http://www.microsoft.com/downloads/details.aspx?FamilyId=6067CB24-06CC-483A-AF92-B919F699C3A0&displaylang=en.

Originally posted 2008-10-16 11:08:09.

Desktop Search

Let me start by saying that Windows Desktop Search is a great addition to Windows; and while it might have taken four major releases to get it right, for the most part it works and it works well.

With Windows Server 2008, Windows Vista, and Windows 7 Desktop Search is installed and enabled by default; and it works in a federated mode (meaning that you can search from a client against a server via the network).

Desktop Search, however, seems to have some issues with junction points (specifically in the case I’ve seen — directory reparse, or directory links).

The search index service seems to do the right thing and not create duplicate entries when both the parent of the link and the target are to be indexed (though I don’t know how you would control whether or not the indexer follows links in the case where the target wouldn’t normally be indexed).

The search client, though, does not seem to properly provide results when junction points are involved.

Let me illustrate by example.

Say we have directory tree D1 and directory tree D2 and both of those are set to be indexed.  If we do a search on D1 it produces the expected results.  If we do a search on D2 it produces the expected results.

Now say we create a junction point (link) to D2 from inside D1 called L1.  If we do a search on L1 we do not get the same results as if we’d searched in D2.

My expectation would be that the search was “smart” enough to do the search against D2 (taking the link into consideration) and then present the results with the path altered to reflect the link L1.
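The remapping I’m describing isn’t hard to express.  Here’s a sketch in Python, using symlinks as a stand-in for NTFS junctions (the directory names mirror the example above, and the search itself is just a naive content scan; the point is the path handling):

```python
import os
import tempfile

def search(root, term):
    """Naive content search that follows directory links and reports
    results under the path the user actually searched (the link),
    not under the link's target."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root, followlinks=True):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, errors="ignore") as f:
                if term in f.read():
                    hits.append(path)
    return hits

# Build the example: trees D1 and D2, and a link L1 inside D1 -> D2.
base = tempfile.mkdtemp()
os.makedirs(os.path.join(base, "D1"))
os.makedirs(os.path.join(base, "D2"))
with open(os.path.join(base, "D2", "note.txt"), "w") as f:
    f.write("the needle is in here")
os.symlink(os.path.join(base, "D2"), os.path.join(base, "D1", "L1"))

# Searching via the link finds the document, and the result path
# is expressed through L1 rather than through D2.
hits = search(os.path.join(base, "D1", "L1"), "needle")
print(hits)
```

That’s all Desktop Search would need to do: search the target, present the results through the link.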

I consider this a deficiency; in fact, it appears to me to be a major failing, since the user of the information shouldn’t be responsible for understanding all the underlying technology involved in organizing the information; he should just be able to obtain the results he expects.

It’s likely the client and the search server need some changes in order to accommodate this; and I would say that the indexer also needs a setting that would force it to follow links (though it shouldn’t store the same document information twice).

If this were a third party search solution running on Windows my expectation would be that file system constructs might not be handled properly; but last time I checked the same company wrote the search solution, the operating system, and the file system — again, perhaps more effort should be put into making things work right, rather than making things [needlessly] different.

Originally posted 2010-01-22 01:00:57.

Linux – Desktop Search

A while ago I published a post on Desktop Search on Linux (specifically Ubuntu).  I was far from happy with my conclusions and I felt I needed to re-evaluate all the options to see which would really perform the most accurate search against my information.

Primarily my information consists of Microsoft Office documents, Open Office documents, pictures (JPEG, as well as Canon RAW and Nikon RAW), web pages, archives, and email (stored as RFC822/RFC2822 compliant files with an eml extension).

My test metric was to take a handful of search terms which I knew existed in various types of documents and check the results (I actually used Microsoft Windows Search 4.0 to prepare a complete list of documents that matched each query, since I knew it worked as expected).

The search engines I tested were: Beagle, Google, Tracker, Pinot, Recoll, and Strigi.

I was able to install, configure, and launch each of the applications.  Actually none of them were really that difficult to install and configure; but all of them required searching through documentation and third party sites — I’d say poor documentation is just something you have to get used to.

Beagle, Google, Tracker, Pinot, and Recoll all failed to find all the documents of interest… none of them properly indexed the email files, and most of them failed to handle plain text files; that didn’t leave a very high bar to pick a winner.

Queries on Strigi actually provided every hit that the same query provided on Windows Search… though I have to say Windows Search was easier to set up and use.

I tried the Nepomuk (KDE) interface for Strigi, though it just didn’t seem to work as well as strigiclient did… and certainly strigiclient was pretty much at the top of the list of butt-ugly, user-hostile, un-intuitive applications I’d ever seen.

After all of the time I’ve spent on desktop search for Linux, I’ve decided all of the search solutions are jokes.  None of them are well thought out, none of them are well executed, and most of them outright don’t work.

Like most Linux projects, more energy needs to be focused on working out a framework for search, rather than everyone going off half-cocked and creating a new search paradigm.

The right model is…

A single multi-threaded indexer running in the background indexing files according to a system wide policy aggregated with user policies (settable by each user on directories they own) along with the access privileges.

A search API that takes the user/group and query to provide results for items that the user has (read) access to.

The indexer should be designed to use plug-in modules to handle particular file types (mapped both by file extension, and by file content).

The indexer should also be designed to use plug-in modules for walking a file system and receiving file-system change events (that allows the framework to adapt as the Linux kernel changes, and would support remote indexing as well).

Additionally, the index/search should be designed with distributed queries in mind (often you want to search many servers, desktops, and web locations simultaneously).

Then it becomes a simple matter for developers to write new/better indexer plug-ins; and better search interfaces.
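As a sketch of what I mean (Python, with invented names; a real framework would run the indexer as a daemon, persist the index, and walk the file system via the plug-ins described above), the core is just a handler registry keyed by file type plus a search call that filters by access:

```python
import os
import tempfile
from typing import Callable, Dict, List, Tuple

class Indexer:
    """Toy inverted index with plug-in text extractors per file type."""

    def __init__(self):
        self._extractors: Dict[str, Callable[[str], str]] = {}
        self._index: Dict[str, List[Tuple[str, str]]] = {}  # term -> [(path, owner)]

    def register(self, extension: str, extractor: Callable[[str], str]):
        """Plug-in module: handles one file type, mapped by extension."""
        self._extractors[extension.lower()] = extractor

    def index_file(self, path: str, owner: str):
        extractor = self._extractors.get(os.path.splitext(path)[1].lower())
        if extractor is None:
            return  # no plug-in registered for this file type
        for term in set(extractor(path).lower().split()):
            self._index.setdefault(term, []).append((path, owner))

    def search(self, term: str, user: str,
               can_read: Callable[[str, str], bool]) -> List[str]:
        """Search API: return only items the querying user may read."""
        return [path for path, owner in self._index.get(term.lower(), [])
                if can_read(user, owner)]

def read_text(path):
    # a trivial plain-text plug-in
    with open(path) as f:
        return f.read()

# Demo: two files with different owners, one query, access-filtered results.
idx = Indexer()
idx.register(".txt", read_text)
tmp = tempfile.mkdtemp()
mine = os.path.join(tmp, "mine.txt")
theirs = os.path.join(tmp, "theirs.txt")
with open(mine, "w") as f:
    f.write("project notes")
with open(theirs, "w") as f:
    f.write("project secrets")
idx.index_file(mine, owner="me")
idx.index_file(theirs, owner="someone-else")

hits = idx.search("project", "me", can_read=lambda user, owner: user == owner)
print(hits)   # only mine.txt; the other hit is filtered by access
```

New file types become a `register()` call, and better front ends just consume `search()`; that’s the whole point of agreeing on a framework first.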

I’ve pointed out in a number of recent posts that you can effectively use Linux as a server platform in your business; however, it seems that if search is a requirement you might want to consider ponying up the money for Microsoft Windows Server 2008 and enjoying seamless search (that works) between your Windows Vista / Windows 7 desktops and Windows Server.

REFERENCES:

Ubuntu – Desktop Search

Originally posted 2010-07-16 02:00:19.

bootrec.exe

Bootrec.exe, available as part of the repair command line, can resolve a number of start-up issues on Windows.  It comes in quite handy for replacing the master boot record (MBR) and boot loader (a good way to remove a multi-boot manager like GRUB).

Be sure you understand what you’re doing if you choose to use it.
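For reference, the four operations it supports (as documented in the linked Microsoft article):

```
bootrec /FixMbr      writes a fresh master boot record (partition table untouched)
bootrec /FixBoot     writes a new boot sector to the system partition
bootrec /ScanOs      scans all disks for Windows installations
bootrec /RebuildBcd  scans for installations and rebuilds the BCD store
```

/FixMbr is the one that evicts GRUB; /RebuildBcd is the one you want when Windows installations have gone missing from the boot menu.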

 Use the Bootrec.exe tool in the Windows Recovery Environment to troubleshoot and repair startup issues in Windows

Originally posted 2013-11-13 17:00:09.

Windows 6 Service Pack 2

It’s out… it’s been in BETA for quite some time.

Just so you’re clear: Windows 6 covers all of the Vista family and the Server 2008 family.  There’s an installer for 32-bit and one for 64-bit, and there’s also a DVD image that you can install either from.

You can find a number of articles on the web telling you all about what was originally supposed to be in SP2, and what ended up in it… other than Bluetooth 2.1 and Blu-Ray support, there isn’t much that caught my eye in the way of “features”.

The big thing you will notice is that this makes Vista noticeably faster… and it includes the compcln.exe tool that allows you to remove previous component versions (saving disk space; of course, once you do so, you cannot go back to previous versions… but if your machine is stable after SP2 you probably wouldn’t want to).

You must have SP1 installed first (Server 2008 comes with SP1 “pre-installed”).

You can access the Microsoft TechNet article via the link below and download the file(s) you desire.  At the moment SP2 is not included in automatic updates, but it will likely be pushed out soon.

http://technet.microsoft.com/en-us/windows/dd262148.aspx

Originally posted 2009-06-07 11:00:22.

Computer Tid Bits

I haven’t sent one of these tidbit emails out in a long, long time; this is just a collection of little points that you might find come in handy.

Server 2008 is indeed out and available. I think I’m going to wait a few months (and I’m just about out of funds for MSFT store purchases, so it’s doubtful I can get a copy for anyone else; I’ll probably do the MSDN OS subscription again). Hyper-V has not shipped as of yet.

Service Pack 1 for Vista can be downloaded, or you’ll get it from Windows Update. If you’re updating more than a single machine, download the whole thing (Windows Update will swamp your connection). There are separate packs for 32-bit and 64-bit (you may need both if you have both kinds of machines). Also, copy the update file to the local disk (it will need elevated privileges to install).

Virtual Server 2005 R2 can be installed on XP, XP-64, Vista-32, or Vista-64. The management interface requires IIS, so that’s a little different with the PWS version on non-server platforms. If you have VS installed on a server, you should be able to manage _all_ of your installations from one management interface (though Vista doesn’t make that easy).

Google GMail allows you to host your domains for email there for free… you basically get GMail accounts in your own domain. I’ve moved my mail services over there for the time being (I still archive all my email on my own server at home, but the active send/receive is done via GMail).

Parallels is coming out with a new server product (64- & 32-bit) to compete with Hyper-V; I looked at the beta (definitely a beta, but usable). They may be able to get some market share, but my guess is they’ll get that share from VMware (I didn’t care for the Mac-ish look of the product on Windows).

2.5″ SATA disk drives continue to fall in price; Seagate 250GB drives were $104 @ Fry’s, and they still had some on the shelf on Monday!!!

Intel hasn’t released most of the 45nm processor family yet; the older Core2 dual and quad processors continue to be a great buy. Remember that really none of the current Intel chipsets take advantage of the higher transfer rates the newer processors are capable of (well, the X38 does, but that’s supposed to have major issues), so you might want to wait for the next generation of Intel chips and motherboards to hit the market. FYI: Intel delayed the release because AMD missed their ship dates… their new cores had some rather serious flaws.

Notebook and desktop memory are nearly on par with each other. You can purchase 2 x 2GB for under $100 (easily, even for the really fast memory). $60 is actually the low price, and $80 gets you high quality with heat spreaders (notebook memory doesn’t have heat spreaders; no room). 2 x 1GB can be purchased for $40!!!

Originally posted 2008-04-01 12:58:23.

Microsoft WebsiteSpark

A program that offers visibility, support and software for professional Web Developers and Designers

If your company has ten or fewer employees, has been around for less than three years, and provides services, support, and hosting to businesses that develop web sites and applications, you might qualify for deeply discounted Windows Web Server and SQL Server Web Edition (like free, or nearly free).

You can get more information at the Microsoft® WebsiteSpark page…

Microsoft® WebsiteSpark

Originally posted 2009-11-16 01:00:09.

Hyper-V Server

With the release of Windows Server 2008, Microsoft made a huge step forward by releasing a thin, high-performance hypervisor for machine virtualization: Hyper-V.

Microsoft has also baited the market by offering a free version of Windows Server 2008 specifically designed to be a virtualization host: Hyper-V Server.

I decided to play with Windows Server 2008 with Hyper-V and with Hyper-V Server to get a feel for what they could do.

Installation is a snap; much the same as Vista.

With Windows Server 2008 with Hyper-V everything goes very smoothly and just works.  You can use the Hyper-V manager to set up virtual machines, run them, stop them, etc.  But one thing you want to do while you have Windows Server 2008 up and running is figure out everything you need to do to remotely connect to and manage Hyper-V and Server 2008 from your workstation, because Hyper-V Server isn’t going to allow you to do much from the console.

To say it’s a little complicated to get remote Hyper-V management working is an understatement; after I figured it out, I found a tool that can help automate the setup, which makes life much easier.

The one thing I never got working from Vista x64 was remote management of Windows Server 2008, and you really need that as well (remember, you don’t get much capability from the console).  I’ll probably play with that a little more; and certainly I’ll get it working before I deploy any Hyper-V servers (it’s not a huge problem if you have a Windows Server 2008 machine already; remote management of other Windows Server 2008 boxes just works).

Now, after the headache of getting everything configured properly, it was time to put Hyper-V through its paces.

First task: migrate a machine over from Virtual Server 2005 R2 SP2… piece of cake.  Copy over the VHD files, create a machine, and hook up the disks (backtrack, since Hyper-V seems to have a fairly set directory format for machines and disks; if you create a new machine on Hyper-V first, you’ll see the layout).  Boot the machine, connect, remove the virtual machine additions, reboot, install the new virtual machine files (it asks to update the HAL; say yes), reboot, install the new virtual machine files once more, reboot, re-generate the SID, and rename the machine (I still have the old one, and I don’t want confusion)… and everything works great.  Shut down the machine, add a second processor, start it up… and a dual-processor virtual machine is born.

I migrated over 32-bit XP Professional, and did a test install of 64-bit Server 2003… and everything worked just fine.

Don’t get carried away just yet.

There are a couple of gotchas with this.

  • To effectively use the free Hyper-V Server you either need a Windows Server 2008 (full install) or you need to get the remote tools working from your workstation; that’s non-trivial.
  • To run Hyper-V Server or Windows Server 2008 with Hyper-V you need a machine with hardware virtualization and execute disable (which really isn’t that uncommon these days, just make sure your BIOS has those features enabled).
  • Once you migrate a machine to Hyper-V there’s no automated way to go back to Virtual Server 2005 R2 SP2 (sure you can probably do it — but it’s going to be a pain).
  • To get performance out of Hyper-V you really need to use SCSI virtual disks; right now Microsoft doesn’t support booting from SCSI disks in Hyper-V, since they only support the para-virtualized SCSI interface.  So to get performance you have to have an IDE boot disk and run off SCSI disks (not exactly a common installation, so you probably won’t be converting any physical machines like that; it seems like a nightmare just waiting to unfold).

Fortunately I’m not in a huge hurry to move to Hyper-V; since it’s a cornerstone of Microsoft’s push to own the virtual infrastructure market, I suspect we’ll see the issues that prevent it from being all that it can be resolved quickly.

And I’ll close with an up-note… WOW — the performance was very impressive… I really wish I had a test machine with lots of spindles to see what kind of load I could realistically put on it.

Originally posted 2008-11-15 08:00:52.