The ‘dir’ command is really slow in PowerShell

Since PowerShell is trying to do so much under the covers, it can get silly at times. When you have many thousands of files in a directory (which is a silly idea to begin with), PowerShell slows to a crawl, while the old dir command bundled in ‘cmd.exe’ still works great. The easiest answer is to simply use that command instead, break your data into smaller chunks, and then PowerShell-ize your data after the fact. If you don’t do this, you can really screw yourself when your PowerShell script appears to run slowly or to stop completely.

<pre>

cmd /c dir

</pre>

This will run the old dir command from cmd.exe for you, right from PowerShell. But what if cmd.exe goes away, as Microsoft promises? Either you write a wrapper today whose insides you can swap for something else without changing any of your calling code, or you find that other command now. I am thinking of a Unix port like the real ‘ls’ (not the PowerShell alias) or ‘find’. find suffers a similar issue on Unix, but only if you give it parameters; a plain find (find . or find -print) works fine because it just goes one record at a time. Also note that in PowerShell, what counts is how many files you have in one directory, or how many you ask about from one command.
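
As a rough sketch of the “break it into chunks and PowerShell-ize it after the fact” idea above: grab the bare names with dir /b, take only the slice you actually care about, and then turn that slice back into objects. The 500-item chunk size is just an example, not a magic number.

<pre>
# Rough sketch: let the old dir do the heavy listing (bare names only
# with /b), work on a small chunk, then convert just that chunk back
# into PowerShell objects.
$names = cmd /c dir /b
$chunk = $names | Select-Object -First 500
$items = $chunk | ForEach-Object { Get-Item -LiteralPath $_ }
</pre>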
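
And here is one way the wrapper could look, as a sketch only: the function name Get-RawListing and the folder path are made up for illustration. The point is that the rest of a script never calls cmd.exe directly, so if cmd.exe ever disappears, only the one line inside the function has to change.

<pre>
# Hypothetical wrapper: everything else calls Get-RawListing, never
# cmd.exe itself. Swap the body for a ported ls or find later if needed.
function Get-RawListing {
    param([string]$Path = '.')
    cmd /c dir /b $Path
}

# Example use: grab the first ten names from a big folder.
Get-RawListing C:\SomeBigFolder | Select-Object -First 10
</pre>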
