Update to my “Robust Posh Script to Delete Files” Script

I have made a modification to my “Robust Posh Script to Delete Files” scripts so that they will only delete actual files in the specified path, and not folders. They do this by using the test “Test-Path $_.fullname -PathType Leaf” to check whether the object being considered is a file. HERE is the link to the updated post and scripts.

Posted February 2, 2016 by Norm Enger in Microsoft SQL Server, Powershell

Robust Posh Script to Delete Files, Checks for Locked Files First

UPDATE 2/2/2016: I have made a modification to the scripts so that they will only delete actual files in the specified path, and not folders. They do this by using the test “Test-Path $_.fullname -PathType Leaf” to check whether the object being considered is a file.

I am back blogging for the New Year 2016! Happy New Year, everyone!

Before we begin, here are source files for the below scripts:

PowerShell Script
SQL Server Agent Script (incorporating the PowerShell Script)

Here is a nice Posh (PowerShell) script I have written to delete files from a specified directory that are older than a specified number of days. The script is robust in the following ways:

  • It checks whether a file that is a candidate for deletion is locked (and therefore cannot be deleted). If any locked files are found, the output reports the number of files that could not be deleted and lists the names of up to the first 10.

Here is an example of the output (formatted for readability):

The error information returned by PowerShell is: 

Error: A total of 1 File(s) Could Not Be Deleted. 
See the following output: 

Error: The Locked File "MyBackupFile.BKP" could not be deleted.
Note: 0 additional locked files could not be deleted. 

Total # Files Older Than (34) Days Successfully Deleted 
From Directory "B:\Backups\myBackups\": 1458.

  • If all deletions are successful, the output looks similar to the following:
Total # Files Older Than (34) Days Successfully Deleted 
From Directory "B:\Backups\myBackups\": 1458.

  • The script can be used to delete from local directories or network shares.
  • I have designed the script to be run as a stand-alone Posh script, or as a SQL Server Agent job scheduled to run on a regular basis. I have successfully tested the SQL Server Agent job on all versions of SQL Server from SQL 2008 up through SQL 2014.
  • Output from the PowerShell script is appended to the SQL Server Agent job’s output history, for your later examination.

Note: For the SQL Server Agent job to work, the Windows service account that the SQL Server Agent runs under must have delete rights in the specified directory.
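The core of the approach — skip folders, treat a file that cannot be deleted as locked, and remove everything older than the cutoff — can be sketched in Python as well (an illustrative translation, not part of the original scripts; on Windows, a file held open by another process will raise an error on the delete attempt):

```python
import time
from pathlib import Path

def delete_old_files(directory, days_to_keep):
    """Delete plain files older than days_to_keep days; collect the names
    of files that could not be deleted (e.g. locked by another process)."""
    cutoff = time.time() - days_to_keep * 86400
    deleted, locked = 0, []
    for entry in Path(directory).iterdir():
        if not entry.is_file():              # like Test-Path -PathType Leaf: skip folders
            continue
        if entry.stat().st_mtime >= cutoff:  # not old enough yet
            continue
        try:
            entry.unlink()                   # raises OSError if the file is locked (Windows)
            deleted += 1
        except OSError:
            locked.append(entry.name)
    return deleted, locked
```

On a non-Windows system the locked list will usually stay empty, since POSIX filesystems generally allow deleting files that are still open.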

Here is the stand-alone PowerShell script, and an example of a scripted-out SQL Server Agent job.

The only edits to the scripts you may need to make are as follows:

PowerShell script:

#Set the following to the days to keep value
#and the directory value
$daysToKeep = 34
$directory  = 'B:\Backups\myBackups\'

SQL Agent Job script:

SET @BkupDir = 'B:\Backups\myBackups\' --include trailing backslash

...and, within the embedded PowerShell command text:

#Set the following to the days to keep value
$daysToKeep = 34

PowerShell script:


#Set the following to the days to keep value
#and the directory value
$daysToKeep = 34
$directory  = 'B:\Backups\myBackups\'

if (Test-Path $directory)
    {
	$olderThanDt = (Get-Date).AddDays(-$daysToKeep)
	$cntDel = 0
	$cntLocked = 0
	$strMsgFilesRemovedBase = "Total # Files Older Than ({0}) Days Successfully Deleted From Directory `"$directory`"`:" -f $daysToKeep
	$strLockedList = ""
	
	gci $directory | foreach {
        $curFullName = $_.Name
        $curLastWrite = $_.LastWriteTime
        
        if ((Test-Path $_.FullName -PathType Leaf) -and ($curLastWrite -lt $olderThanDt))
        {
            try
            {
                #OpenWrite throws if the file is locked by another process
                [IO.File]::OpenWrite($_.FullName).Close()
                Remove-Item $_.FullName -ErrorAction Stop
                $cntDel += 1
            }
            catch
            {
                $cntLocked += 1
                if ($cntLocked -lt 11)
                {
                    $strLockedList += "`nError: The Locked File `"$curFullName`" could not be deleted."
                }
            }
        }
    }
	
    if ($cntLocked -gt 0)
    {
        $cntLockedAddl = $cntLocked - 10
		if ($cntLockedAddl -lt 0)
		{
			$cntLockedAddl = 0
		}
        $strLockedList += "`nNote: $cntLockedAddl additional locked files could not be deleted."
    }

	if ($cntDel -gt 0)
	    {write-output "$strMsgFilesRemovedBase $cntDel."}
	else
	    {write-output "$strMsgFilesRemovedBase None Found."}

    if ($cntLocked -gt 0)
    {
        throw "Error: A total of $cntLocked File(s) Could Not Be Deleted. See the following output: $strLockedList `n$strMsgFilesRemovedBase $cntDel."
    }

}
else
    {throw "Error: Specified directory could not be accessed or does not exist."}

SQL Agent Job script:


USE [msdb]
GO

BEGIN TRANSACTION

DECLARE @BkupDir NVARCHAR(1000), @cmd NVARCHAR(MAX), @cmdVar NVARCHAR(MAX)
SET @BkupDir = 'B:\Backups\myBackups\' --include trailing backslash

DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

END

DECLARE @jobId BINARY(16)

SET @cmdVar = N'#Set the following to the days to keep value
#and the directory value
$daysToKeep = 34
$directory  = ''' + @BkupDir + N'''

if (Test-Path $directory)
    {
	$olderThanDt = (Get-Date).AddDays(-$daysToKeep)
	$cntDel = 0
	$cntLocked = 0
	$strMsgFilesRemovedBase = "Total # Files Older Than ({0}) Days Successfully Deleted From Directory `"$directory`"`:" -f $daysToKeep
	$strLockedList = ""
	
	gci $directory | foreach {
        $curFullName = $_.Name
        $curLastWrite = $_.LastWriteTime
        
        if ((Test-Path $_.FullName -PathType Leaf) -and ($curLastWrite -lt $olderThanDt))
        {
            try
            {
                #OpenWrite throws if the file is locked by another process
                [IO.File]::OpenWrite($_.FullName).Close()
                Remove-Item $_.FullName -ErrorAction Stop
                $cntDel += 1
            }
            catch
            {
                $cntLocked += 1
                if ($cntLocked -lt 11)
                {
                    $strLockedList += "`nError: The Locked File `"$curFullName`" could not be deleted."
                }
            }
        }
    }
	
    if ($cntLocked -gt 0)
    {
        $cntLockedAddl = $cntLocked - 10
		if ($cntLockedAddl -lt 0)
		{
			$cntLockedAddl = 0
		}
        $strLockedList += "`nNote: $cntLockedAddl additional locked files could not be deleted."
    }

	if ($cntDel -gt 0)
	    {write-output "$strMsgFilesRemovedBase $cntDel."}
	else
	    {write-output "$strMsgFilesRemovedBase None Found."}

    if ($cntLocked -gt 0)
    {
        throw "Error: A total of $cntLocked File(s) Could Not Be Deleted. See the following output: $strLockedList `n$strMsgFilesRemovedBase $cntDel."
    }

}
else
    {throw "Error: Specified directory could not be accessed or does not exist."}'

EXEC @ReturnCode =  msdb.dbo.sp_add_job @job_name=N'Posh - Clean Up Old Backup Files', 
		@enabled=1, 
		@notify_level_eventlog=0, 
		@notify_level_email=0, 
		@notify_level_netsend=0, 
		@notify_level_page=0, 
		@delete_level=0, 
		@description=N'No description available.', 
		@category_name=N'[Uncategorized (Local)]', 
		@owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object:  Step [Delete old files]    Script Date: 10/13/2015 8:18:11 AM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Delete old files', 
		@step_id=1, 
		@cmdexec_success_code=0, 
		@on_success_action=1, 
		@on_success_step_id=0, 
		@on_fail_action=2, 
		@on_fail_step_id=0, 
		@retry_attempts=0, 
		@retry_interval=0, 
		@os_run_priority=0, @subsystem=N'PowerShell', 
		@command=@cmdVar, 
		@database_name=N'master', 
		@flags=32
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'1', 
		@enabled=1, 
		@freq_type=8, 
		@freq_interval=32, 
		@freq_subday_type=1, 
		@freq_subday_interval=0, 
		@freq_relative_interval=0, 
		@freq_recurrence_factor=1, 
		@active_start_date=20151008, 
		@active_end_date=99991231, 
		@active_start_time=233000, 
		@active_end_time=235959
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:

GO

Enjoy!

Posted January 1, 2016 by Norm Enger in Microsoft SQL Server, Powershell

Query to Display Recent Data and TLog File Autogrowth and Autoshrink Events

Here is a query to read the default trace on your SQL instance and produce a report of recent Data and TLog file autogrowth and autoshrink events on that instance. The query works for SQL 2005 and above, and no edits are required.

The query is a modified version of the one used by the SQL 2008 R2 Management Studio’s “Standard Report” named “Disk Usage,” which has a section that displays this data (Object Explorer –> right click on database –> Reports –> Standard Reports –> “Disk Usage”).

Just open a query window connected to the SQL instance you wish to report on and run the query. This version displays data for all databases, including the database name, the date of the autogrowth event, the number of milliseconds it took, and the size of the growth in MB.

Here is the script.

--NE - 6/17/2011

--Query to display recent Data and TLog file autogrowth and autoshrink events
--for all databases on the instance. Based on query used by the SSMS Standard
--Report named "Disk Usage" which has a section that displays this data. Data
--is retrieved from the default trace.

--Works for SQL 2005 and above.

USE [master]
GO

BEGIN TRY
	IF (SELECT CONVERT(INT,value_in_use) FROM sys.configurations WHERE NAME = 'default trace enabled') = 1
	BEGIN 
		DECLARE @curr_tracefilename VARCHAR(500);
		DECLARE @base_tracefilename VARCHAR(500);
		DECLARE @indx INT;

		SELECT @curr_tracefilename = path FROM sys.traces WHERE is_default = 1;
		SET @curr_tracefilename = REVERSE(@curr_tracefilename);
		SELECT @indx  = PATINDEX('%\%', @curr_tracefilename) ;
		SET @curr_tracefilename = REVERSE(@curr_tracefilename) ;
		SET @base_tracefilename = LEFT( @curr_tracefilename,LEN(@curr_tracefilename) - @indx) + '\log.trc'; 
		SELECT
			--(DENSE_RANK() OVER (ORDER BY StartTime DESC))%2 AS l1,
			ServerName AS [SQL_Instance],
			--CONVERT(INT, EventClass) AS EventClass,
			DatabaseName AS [Database_Name],
			Filename AS [Logical_File_Name],
			(Duration/1000) AS [Duration_MS],
			CONVERT(VARCHAR(50),StartTime, 100) AS [Start_Time],
			--EndTime,
			CAST((IntegerData*8.0/1024) AS DECIMAL(19,2)) AS [Change_In_Size_MB]
		FROM ::fn_trace_gettable(@base_tracefilename, default)
		WHERE 
			EventClass >=  92
			AND EventClass <=  95
			--AND ServerName = @@SERVERNAME
			--AND DatabaseName = 'myDBName'  
		ORDER BY DatabaseName, StartTime DESC;  
	END     
	ELSE    
		SELECT -1 AS l1,
		0 AS EventClass,
		0 DatabaseName,
		0 AS Filename,
		0 AS Duration,
		0 AS StartTime,
		0 AS EndTime,
		0 AS ChangeInSize 
END TRY 
BEGIN CATCH 
	SELECT -100 AS l1,
	ERROR_NUMBER() AS EventClass,
	ERROR_SEVERITY() DatabaseName,
	ERROR_STATE() AS Filename,
	ERROR_MESSAGE() AS Duration,
	1 AS StartTime, 
	1 AS EndTime,
	1 AS ChangeInSize 
END CATCH

Posted June 21, 2011 by Norm Enger in Microsoft SQL Server


Publishing Source Code to WordPress Using Windows Live Writer 2011

My previous blog post “Publishing Code Snippets on My WordPress Blog” explained how I used a read-only textarea to publish source code. Alas, WordPress has started stripping out <textarea> tags when you publish to their site or edit existing posts. They explain that this is for security reasons at http://en.support.wordpress.com/code/.

The above WordPress article also references another article at http://en.support.wordpress.com/code/posting-source-code/ which explains that they have a “sourcecode” “short code” available which can help with publishing source code blocks or code snippets. Unfortunately, that article is muddled about how to insert the short codes, and it offers no help for Windows Live Writer 2011 users.

After experimenting a bit, I have discovered how to easily use the short codes from within Live Writer, in combination with the <PRE> tag, and what to expect when using it in this editor.

The basic way to include a code block is as follows. Note that the short code tags are enclosed in square brackets rather than angle brackets. Also you will have to type this code in HTML view.

[screenshot]
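Based on the parameters discussed in this post, the markup typed into the HTML view looks roughly like the following (a sketch: the SELECT line is just a hypothetical snippet, and you may need to experiment with the exact placement of the <PRE> tags):

```text
<pre>[sourcecode language="sql" wraplines="false" gutter="false"]
SELECT TOP 10 name FROM sys.databases;
[/sourcecode]</pre>
```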

Please note that when you go back to the Edit view in Live Writer, the above “sourcecode” lines will appear as text. Do not edit these out. When you publish, the WordPress site will accept these and will not display them, but will interpret them correctly and display your code with the fancy highlighting.

A second important note: when in the Edit view in Live Writer, you will see your snippets appearing as plain text, without the syntax coloring. Don’t try to correct this. Once you publish the post to WordPress, it will appear as expected, with the syntax highlighting, toolbar, etc.

Please note that you must also include the <PRE> tag because if you do not, WordPress will remove the line breaks and tabs from your snippets when you publish.

In my example, the language=”sql” parameter specifies which language I want syntax highlighting for, in my case, SQL. The wraplines=”false” parameter ensures that a horizontal scroll bar will appear if I have longer lines of code. This particular option was needed with my WordPress theme, but you may need to experiment with this. The gutter=”false” parameter prevents line numbering in the code block. These and other parameters are explained in greater detail at http://en.support.wordpress.com/code/posting-source-code/.

Happy blogging!

Posted June 20, 2011 by Norm Enger in Microsoft SQL Server

Display Current Backup/Restore Progress Using DMVs

Here is a script gleaned from an article at http://www.mssqltips.com/tip.asp?tip=2343, with some minor modifications of mine, which very nicely allows you to monitor SQL backup or restore progress, including estimated completion times, based on data from system DMVs. This script works for SQL 2005 and above.

It also reports not only on native SQL backups and restores, but on third-party backups and restores performed by tools such as Quest SQL LiteSpeed or the Simpana CommVault SQL backup agent.

Here is the script.

--Display current backup/restore progress using DMVs.
--SQL 2005 and above
--From http://www.mssqltips.com/tip.asp?tip=2343

USE master

SELECT
	session_id as SPID,
	CONVERT(VARCHAR(50),start_time,100) AS start_time,
	percent_complete,
	CONVERT(VARCHAR(50),dateadd(second,estimated_completion_time/1000, getdate()),100) as estimated_completion_time,
	command, a.text AS Query
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) a
WHERE r.command LIKE 'BACKUP%' OR r.command LIKE 'RESTORE%'

Posted June 20, 2011 by Norm Enger in Microsoft SQL Server


Using Excel 2007 and Excel 2010 to Analyze Perfmon Captures in the Form of .csv Comma-Delimited Counter Logs

Here is an explanation of how to easily use Excel 2007 and Excel 2010 to analyze performance captures saved to counter logs via Windows’ built-in perfmon utility. This article assumes that you have already collected whatever SQL or Windows or other performance counters you wish to analyze, and that you have captured them to counter logs in the form of one or more comma-delimited .csv files.

We will look at how to easily open a .csv file, apply a few formatting changes, and use the very handy Pivot Chart functionality built into Excel to graph your counters. Finally, we will demonstrate how to use date filters to drill deeper into specific time ranges, in order to view the graph data in greater detail.

The screen shots and steps below are for Excel 2007, but they should also work for Excel 2010.

Note that in the process below, Excel may not by default select the line chart type you prefer, for example one with data points displayed. If this is the case, right-click on an open area of the Pivot Table chart and select “Change Chart Type.” Select the left-most line chart type (the most basic line chart), and click the option at the bottom to make that your default type. See Excel help for more info.

Let’s get started.

1. Double-click on the .csv counter log file. The .csv extension should already be associated, so the file will automatically open in Excel.

[screenshot]

2. Make some formatting changes.

a. Change the text in the A1 cell from “(PDH-CSV 4.0…” to simply “DateTime.”

[screenshot]

b. Remove row 2 by right-clicking on the row number –> “Delete.” This row, which contains the first row of data, is typically junk.

[screenshot]

c.  Highlight the entire “A” column by clicking on the column header, then right-click anywhere in the column –> Format Cells…

[screenshot]

d. On the Number tab, select the “Date” category and the “3/14/01 1:30 PM” type, and click OK.

[screenshot]

e. Press Ctrl + Home to get the focus back on the A1 cell.

[screenshot]

f. These steps are repetitive for every .csv file you will analyze, so I recommend that you automate steps a. through e. with an Excel macro: have Excel record your actions, then assign a keyboard shortcut to the macro. Please refer to the Excel help for how to do this.

Note that I have found the Excel macro functionality unworkable for the steps beyond this point.
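If you would rather script this cleanup than record a macro, steps a. through e. translate naturally to a few lines of pandas (an illustrative sketch, not part of the original post; it assumes the default PDH-CSV layout with the timestamp in the first column):

```python
import pandas as pd

def load_perfmon_csv(path_or_buffer):
    """Steps a. through e. in code: rename the PDH header cell to DateTime,
    drop the junk first data row, and parse the timestamps as dates."""
    df = pd.read_csv(path_or_buffer)
    df = df.rename(columns={df.columns[0]: "DateTime"})   # step a
    df = df.iloc[1:].reset_index(drop=True)               # step b: junk row
    df["DateTime"] = pd.to_datetime(df["DateTime"])       # steps c and d
    return df
```

The resulting DataFrame can then be pivoted or plotted directly, in place of the Pivot Chart steps that follow.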

3. Create your Pivot Chart

a. Switch to the “Insert” ribbon, click on the down-arrow on the “Pivot Table” button, and select “Pivot Chart.”

[screenshot]

b. The Pivot Chart dialog “auto-magically” selects the correct Table/Range of data to analyze. Very nice functionality! Click OK.

[screenshot]

c. In the PivotTable Field List box, click and drag “DateTime” down to the “Axis Fields (Categories)” pane below.

[screenshot]

d. In this example, we are going to first analyze the total processor time. In the PivotTable Field List box, scroll down and find the “Processor(_Total)\% Processor Time” counter, and click and drag it to the “Values” pane below.

[screenshot]

e. Notice that in this instance, Excel incorrectly guessed that I want to summarize the data by “Count of….” What I actually want is for Excel to summarize the data by “Sum of….” Sometimes Excel correctly guesses this, sometimes not.

To correct this, right-click on the Excel header cell labeled “Count of \\mySQLInstance\Processor(_Total)\% Processor Time” –> “Summarize Data By” –> and select “Sum.”

You may find this step necessary for other counters that you want to graph later as well.

[screenshot]

4. Format and size the graph to your liking.

a. You will now have a basic graph that looks something like this.

[screenshot]

b. For this chart, I will change the Max value shown on the left axis from “120” to “100.” Right-click on the left axis area –> Format Axis. Change the “Maximum” option under “Axis Options” from “Auto” to “Fixed” and type in “100.” Then click Close.

[screenshot]

c. Edit the “Title” area text, which currently reads “Total,” to your liking. I will insert the text “\\mySQLInstance\Processor(_Total)\% Processor Time”

d. I will also click on the legend box on the right and delete it.

e. Here is what my final graph will look like.

[screenshot]

f. You can click on any open space within the chart to select the whole chart, then press Ctrl + C to copy, then paste the chart image into an email or Paint or your favorite editor.

g. Once you have saved the image of the chart, in the PivotTable Field List box, you can uncheck the “Processor(_Total)\% Processor Time” counter, then drag a different counter down into the Values pane.

h. You may repeat the process for as many different counters as you may wish to graph and analyze, reusing the same chart area.

5. Drilling deeper into specific time ranges.

a. Looking at the “Processor(_Total)\% Processor Time” chart I already created, I notice there was a CPU spike between about 5:30 a.m. and 6:30 a.m. I want to investigate that period more closely and “zoom” the graph in to that hour, so I can see in more detail what was happening.

b. In the PivotChart Filter Pane, click on the down-arrow next to “DateTime.”

[screenshot]

c. Go to “Date Filters” –> and select “Between….”

[screenshot]

d. Type in the beginning and ending dates/times in the format “m/d/yy h:mm am/pm” and click OK.

[screenshot]

e. Your chart will now be “zoomed in” to the hour of 5:30 a.m. to 6:30 a.m. and will look something like this.

[screenshot]

There are other features you may want to experiment with, such as adding trend lines, and making other changes to the display of the left axis area.
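Outside Excel, the same “zoom” is just a between filter on the timestamp column; with pandas it might look like this (an illustrative sketch, assuming a DataFrame with an already-parsed DateTime column):

```python
import pandas as pd

def zoom(df, start, end):
    """Keep rows whose DateTime falls in [start, end], like the
    Date Filters -> Between... option in the PivotChart Filter Pane."""
    mask = (df["DateTime"] >= pd.Timestamp(start)) & (df["DateTime"] <= pd.Timestamp(end))
    return df.loc[mask]
```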

Happy charting!

Posted May 30, 2011 by Norm Enger in Excel, Microsoft SQL Server, Performance Monitoring

A PowerShell Script to Help You in Cleaning Up Old or Unused Files in Your SQL Data File Directories

Here is a PowerShell script to list non-locked files in SQL Data directories as possible candidates to delete, in order to free up space. This can be useful for identifying old files that, for whatever reason, were not deleted and may be wasting disk space.

Note: before deleting any file, make sure it is not needed. For example, if a database is offline, its data files will be closed (not locked) and thus show up on this report, but you may not want to delete the files in case someone should need to bring the database back online again.

There are two modifiable sections in the script that allow you to prevent files from being evaluated based on either the file extension or the file name.

The script works by simply looping through the files in the directory that you pass in as a parameter and attempting to read the first line of each file. If a file is locked (as a live SQL data or transaction log file will be), the attempt to open it and read the first line will fail, and the file will not be reported on. If the attempt succeeds, the file will be reported on as a candidate for deletion.
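The probe itself — try to read from the file, and report it only when the read succeeds — looks like this in Python (a sketch for illustration, not the original Posh script; note that on non-Windows systems an open() call rarely fails for a file that is merely in use, so the idea is most meaningful on Windows):

```python
import os

def candidate_files(directory, exclude_exts=(".preg", ".old", ".cer"),
                    exclude_names=("distmdl.mdf", "distmdl.ldf")):
    """List files that can be opened and read (i.e. not exclusively locked),
    skipping excluded extensions and file names, as the Posh script does."""
    candidates = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        if os.path.splitext(name)[1] in exclude_exts or name in exclude_names:
            continue
        try:
            with open(path, "rb") as f:   # a locked file raises an error here on Windows
                f.read(1)
            candidates.append(path)       # readable: report as a deletion candidate
        except OSError:
            pass                          # locked / in use: stay silent, like the script
    return candidates
```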

Being the paranoid DBA type, I have written the script so that it will not perform the actual deletion of any files. Again, I recommend that a human being look at each listed file and make the final decision whether to delete it.

I like to run the script from a PowerShell 2.0 ISE window. The example execution in the script assumes you save this script to your local system as “C:\PowerShell Scripts\UnusedFiles\UnusedFiles.PS1.” Please follow the directions within the comments of the script for a sample execution.

Here is the script.

param([string]$theDir="")

#Norm Enger - 4/30/2011 - PowerShell script to list non-locked files in SQL Data directories
#as possible candidates to delete, in order to free up space. Note: before deleting any
#file, make sure it is not needed. For example, if a database is offline, its data file
#will be closed (not locked) and thus show up on this report, but you would not want to delete
#the file in case someone should need to bring the database back online again.

#To run, open a Powershell window and run a command like the following (starting with
#the ampersand "&" character)...
#& "C:\PowerShell Scripts\UnusedFiles\UnusedFiles.PS1" -theDir "C:\mySQLDirectory\Data"

#Output will be similar to the following:
#C:\mySQLDirectory\Data\New Text Document.txt (10/8/2010 4:46:33 PM)

#Array of extensions to exclude from report (change as needed)...
$arrExcludeExtensions = ".preg",".old",".cer"

#Array of file names to exclude from report (change as needed)...
$arrExcludeFileNames = "distmdl.mdf","distmdl.ldf", `
	"mssqlsystemresource.mdf","mssqlsystemresource.ldf"

#Sets the folder location
Set-Location $theDir

foreach($file in get-childitem | Sort-Object Name `
	| Where-Object {$arrExcludeExtensions -notcontains $_.Extension `
	-and $arrExcludeFileNames -notcontains $_.Name})
{
#Get-Content returns error if the file is in use/locked
#We limit lines to try to read at 1 using TotalCount...
$content = Get-Content $file -TotalCount 1 -ErrorAction silentlycontinue
	if($?){
		$file.FullName.ToString() + `
		" (" + $file.LastWriteTime.ToString() + ")"
	}
	else{} #write out nothing
}

Posted April 30, 2011 by Norm Enger in Microsoft SQL Server, Powershell
