Solve: Need help appending a substring of the previous line to subsequent lines?
|
Answer» You can right-justify the file size in a (e.g.) 16-character-wide column quite easily - then you can use sort /R /+N on the output file. N (the starting column for the sort) will depend on the widths of the date and time fields, which depend on local settings.

Forcing the size data into a 16-byte width seems unnecessary, as does using the sort command. Excel will import it correctly. The new model of this snippet from *censored* has gone from I/O bound to CPU bound. Is this an improvement? [that's a rhetorical question ST]

Code: [Select]
@echo off
rem For every mounted drive letter, list all files (largest first within each folder)
rem and append date, size, name and path to Final_Master.csv.
for /f "usebackq tokens=1" %%v in (`mountvol ^| find ":\"`) do (
    for /f "tokens=* delims=" %%i in ('dir "%%v" /b /s /a-d /o:-s') do (
        echo %%~ti,%%~zi,%%~nxi,%%~dpi>> Final_Master.csv
    )
)

Quote from: Sidewinder on July 17, 2011, 02:32:35 PM
Based on the OP reply #1, the files are sorted correctly coming out of the dir command and require no further processing.

In a later post he says he wants a list of the files in each folder, on each drive letter on his server, sorted in descending order of file size...

Quote
1. I need to search each directory and file in each drive and sort all the files according to their size in descending order, i.e. from the largest file on the server to the smallest, with the path where it exists.

Dir sorts the files for each folder independently, so a further sort will be necessary.

Quote
Forcing the size data into a 16-byte width seems unnecessary, as does using the sort command. Excel will import it correctly.

I asked the OP whether he had Excel but got no reply.

Quote
The new model of this snippet from *censored* has gone from I/O bound to CPU bound. Is this an improvement? [that's a rhetorical question ST]

I'm not sure...

Quote from: Sidewinder on July 17, 2011, 02:32:35 PM
Forcing the size data into a 16-byte width seems unnecessary, as does using the sort command. Excel will import it correctly.

I just found that when importing a CSV, Excel 2003 has a maximum of 65,536 rows; Excel 2007 allows 1,048,576 rows.

Anyhow, PowerShell is much quicker than batch - I catalogued my whole computer in 2 minutes 35 seconds; using batch and dir took 50 minutes 51 seconds. That was roughly 320,000 files in around 62,000 folders. Found 25 GB of forgotten stuff to delete.
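
For anyone wondering what the PowerShell equivalent might look like, here is a minimal sketch of the same job - catalogue every file on every local drive, largest first, into a CSV. This is not the exact script behind the timings above; the drive filter, selected columns and output file name are illustrative, and it should run on PowerShell 2.0 or later.

Code: [Select]
# Catalogue every file on each local drive letter, sorted by size descending,
# and export date, size, name and folder to a CSV file.
Get-PSDrive -PSProvider FileSystem | Where-Object { $_.Root -match '^[A-Za-z]:\\$' } |
    ForEach-Object { Get-ChildItem -Path $_.Root -Recurse -ErrorAction SilentlyContinue } |
    Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property Length -Descending |
    Select-Object -Property LastWriteTime, Length, Name, DirectoryName |
    Export-Csv -Path .\Final_Master.csv -NoTypeInformation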
|
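Going back to the point that dir sorts each folder independently: the right-justified size column suggested in the first reply is one way to get a single server-wide ordering out of plain batch. This is only a minimal sketch - it writes the padded size as the first field so sort /R can work from column 1 instead of needing /+N, and the file names filelist.txt and sorted.txt are made up for the example.

Code: [Select]
@echo off
setlocal enabledelayedexpansion
del filelist.txt 2>nul
for /f "usebackq tokens=1" %%v in (`mountvol ^| find ":\"`) do (
    for /f "delims=" %%i in ('dir "%%v" /b /s /a-d') do (
        rem Right-justify the size in a 16-character field so a plain character
        rem sort orders the lines the same way a numeric sort would.
        set "size=                %%~zi"
        >>filelist.txt echo !size:~-16! %%~fi
    )
)
rem /R reverses the sort (largest first); the size sits in columns 1-16, so no /+N is needed.
sort /R filelist.txt /O sorted.txt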