While rolling out Windows Search indexing I ran into quite a few issues and things I wanted to monitor. One of the main pieces of information I was after was the number of files in the Windows Search index. I could see this number in the Indexing Options of the system, constantly updating, but I was not able to find the same information via WMI, PowerShell cmdlets, performance counters and so on.
This gave me a headache, because it was clear that a) the index had to grow until it covered a fully indexed system, and b) if the index size dropped below a certain amount once indexing was finished, I surely had an issue. I saw the database rebuild out of nowhere, due to pagefile issues or lack of space on the partition where the index database resided.
All of this made it clear that monitoring was essential, but I did not want to play around with event IDs; the number of files in the index was clearly a far better indicator.
This led me to write the following PowerShell script: it invokes a command on a target system and counts the files currently in the index. The Invoke-Command is necessary because, although the OLE DB provider of the Search Index does accept remote requests, a remote query does not return accurate numbers for the overall scope of the index on the system. Getting to the point of even retrieving this information was quite a challenge; whoever finds this page has probably done some research already and will know that. I hope it helps, though.
```powershell
param(
    [string]$TargetServer
)

Invoke-Command -ComputerName $TargetServer -ScriptBlock {
    $value = 0
    try {
        # Query the local Windows Search index through its OLE DB provider
        $objConn = New-Object System.Data.OleDb.OleDbConnection("Provider=Search.CollatorDSO;Extended Properties='Application=Windows'")
        $sqlCommand = New-Object System.Data.OleDb.OleDbCommand("GROUP ON workid [0] AGGREGATE COUNT() as 'Total' OVER (SELECT workid FROM systemindex)")
        $sqlCommand.Connection = $objConn
        $objConn.Open()
        $reader = $sqlCommand.ExecuteReader()
        try {
            # Read() will throw an error on the grouped result set,
            # but we can ignore it - without this call we won't have data
            $ignore = $reader.Read()
        } catch {}
        $value = $reader[2]
    } catch {}

    $XML  = "<prtg>"
    $XML += "<result>"
    $XML += "<channel>indexed files count</channel>"
    $XML += "<value>$value</value>"
    $XML += "</result>"
    $XML += "</prtg>"

    # Just to emit clean, indented XML
    Function WriteXmlToScreen ([xml]$xml)
    {
        $StringWriter = New-Object System.IO.StringWriter
        $XmlWriter = New-Object System.Xml.XmlTextWriter $StringWriter
        $XmlWriter.Formatting = "indented"
        $xml.WriteTo($XmlWriter)
        $XmlWriter.Flush()
        $StringWriter.Flush()
        Write-Output $StringWriter.ToString()
    }

    WriteXmlToScreen $XML
}
```
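For completeness, this is roughly how you would wire it up: save the script and call it with the target server as a parameter (the file name and server name below are placeholders, not from the original post). PRTG's "EXE/Script Advanced" sensor type consumes exactly this kind of XML from standard output:

```powershell
# Hypothetical invocation - script name and server name are placeholders
.\Get-IndexedFileCount.ps1 -TargetServer FILESRV01

# The sensor then parses output of this shape (the value will vary):
# <prtg>
#   <result>
#     <channel>indexed files count</channel>
#     <value>1234567</value>
#   </result>
# </prtg>
```

Note that Invoke-Command relies on PowerShell Remoting (WinRM) being enabled on the target server, and the account running the sensor needs permission to connect.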
An update: what I learned along the way
Of course you wouldn't just monitor the number of indexed files with the script above. You likely have a dedicated drive or partition where the index resides, and you definitely want to monitor the used or free drive space there as well. What you will discover, especially in the beginning but also later when many files are moved or copied to the server, is that the Windows Search index database grows after the indexing of huge amounts of files is done, and then shrinks again. As far as I understand it, some maintenance and deduplication is going on.
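A minimal companion sensor for that drive-space monitoring could look like the sketch below. The server name and drive letter are assumptions you would replace with your own (the index database lives under C:\ProgramData\Microsoft\Search by default, unless you moved it to a dedicated partition); the output follows the same PRTG XML shape as the index-count script:

```powershell
# Sketch, not a finished sensor: target server and drive letter are placeholders
param(
    [string]$TargetServer = "FILESRV01",
    [string]$DriveLetter  = "I:"
)

# Query free space of the partition holding the index database via CIM/WMI
$disk = Get-CimInstance -ComputerName $TargetServer -ClassName Win32_LogicalDisk `
    -Filter "DeviceID='$DriveLetter'"

$freeGB = [math]::Round($disk.FreeSpace / 1GB, 1)

# Same PRTG XML format as the indexed-files sensor; <float> marks a decimal value
"<prtg><result><channel>index drive free GB</channel><value>$freeGB</value><float>1</float></result></prtg>"
```

With both sensors in place you can correlate index file count and free space over time, which is exactly what makes the grow-and-shrink behaviour described above visible.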
On a server with about 14 million files that took about one week to index, I ended up with a 250 GB index database. The moment indexing finished, the database spent another 12 hours growing by a further 100 GB, and then shrank back to about 200 GB.
Don't let those numbers scare you: we are talking about 10 TB (terabytes) of data in those 14 million files, which is quite a bit. Most other file servers won't have such huge databases, you won't see such huge increases while the index database is doing its kind of clean-up, and the initial indexing won't take a week to finish.
What I wanted to show with this is simply that you really want to monitor all of this information and keep a close eye on it. I saw the database make huge jumps in size in very short periods of time. If the drive fills up, your index may become corrupted and Windows will start indexing from zero again; you want to avoid this. Once the index is finished and no huge incoming file operations take place, you won't see many jumps anymore. It will calm down. But still: always make sure you have enough space on the partition where the index database resides, and proper monitoring on it, so you can react quickly (going as far as automating a stop of the Windows Search service while the free space shrinks, to avoid database corruption).