Solved: Need help appending a substring of the previous line to subsequent lines

Answer:

You can right-justify the file size in a (say) 16-character-wide column quite easily - then you can use sort /R /+N on the output file. N (the starting column for the sort) will depend on the date and time field widths, which depend on locale settings.
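The right-justification itself is just a pad-and-substring trick with delayed expansion - a minimal sketch (the value and variable name are illustrative; in the real loop the value comes from %%~zi):

Code: [Select]
setlocal enabledelayedexpansion
rem Illustrative value; in the real loop this comes from %%~zi
set "size=17503511"
rem Prepend 16 spaces, then keep only the last 16 characters:
set "size=                !size!"
set "size=!size:~-16!"
rem Prints [        17503511] - right-justified in a 16-char field
echo [!size!]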

A tip with sort: specifying an output file, for example sort /R /+24 Unsorted.txt /O Sorted.txt, is quicker (according to Microsoft) than redirecting stdout to a file: sort /R /+24 Unsorted.txt > Sorted.txt

I have actually tried this method on my system, and I found I was getting blank date/time/size fields for certain files - about 20 out of 600,000 - whereas using the full DIR listing didn't do that.

Code: [Select]
@echo off
setlocal enabledelayedexpansion
rem For each mounted drive letter, list every file, largest first per folder
for /f "usebackq tokens=1" %%v in (`mountvol ^| find ":\"`) do (
    for /f "delims=" %%i in ('dir "%%v" /b /s /a-d /o:-s') do (
        rem Right-justify the size in a 16-character field
        set "fsize=                %%~zi"
        set "fsize=!fsize:~-16!"
        rem Redirection first, so no trailing space lands in the record
        >>Unsorted_Intermediate.csv echo %%~ti,!fsize!,%%~nxi,%%~dpi
    )
)
Admittedly, the findstr commands are not needed; they were left over from testing without the /b switch.

Based on the OP's reply #1, the files come out of the dir command correctly sorted and require no further processing, while the echo command formats the CSV file and gang punches the proper folder name into each record.

Quote

and my expected output should be

07/01/2011 02:29 PM 17,503,511 Conf_Logs.log D:\
07/06/2011 09:07 PM 3,013,934 Master.txt D:\
12/01/2010 11:48 AM 1,383,424 CRBT_FRR_ID_50.doc D:\
01/17/2011 12:49 PM 1,042,944 SpainFCA_Installation Note.doc D:\


01/08/2011 07:18 PM 3,178,425 IMG_2801.JPG D:\Accord_Get2Gthr_Photos
01/08/2011 07:18 PM 2,879,090 IMG_2802.JPG D:\Accord_Get2Gthr_Photos
01/08/2011 07:18 PM 2,856,676 IMG_2803.JPG D:\Accord_Get2Gthr_Photos
01/08/2011 07:35 PM 2,745,495 IMG_2818.JPG D:\Accord_Get2Gthr_Photos
01/08/2011 08:50 PM 2,685,050 IMG_2838.JPG D:\Accord_Get2Gthr_Photos
01/08/2011 07:14 PM 2,663,947 IMG_2799.JPG D:\Accord_Get2Gthr_Photos



01/08/2011 08:40 PM 100,078,852 MVI_2833.AVI D:\Accord_Get2Gthr_Photos\vid
01/08/2011 07:47 PM 10,844,132 MVI_2827.AVI D:\Accord_Get2Gthr_Photos\vid
01/08/2011 07:47 PM 11,458 MVI_2827.THM D:\Accord_Get2Gthr_Photos\vid

Forcing the size data into a 16-byte width seems unnecessary, as does using the sort command. Excel will import it correctly.

The new model of this snippet from *censored* has gone from I/O bound to CPU bound. Is this an improvement? [that's a rhetorical question ST]

Code: [Select]
@echo off
rem Walk every mounted drive letter; /o:-s sorts by size within each folder
for /f "usebackq tokens=1" %%v in (`mountvol ^| find ":\"`) do (
    for /f "delims=" %%i in ('dir "%%v" /b /s /a-d /o:-s') do (
        >>Final_Master.csv echo %%~ti,%%~zi,%%~nxi,%%~dpi
    )
)

Quote from: Sidewinder on July 17, 2011, 02:32:35 PM
Based on the OP's reply #1, the files come out of the dir command correctly sorted and require no further processing

In a later post he says he wants a list of files in each folder, on each drive letter on his server, sorted in descending order of file size...

Quote
1. I need to search each directory and file in each drive and sort all the files according to their size in descending order, i.e. largest file on the server to smallest file, with the path where it exists.

Dir sorts the files for each folder independently, so a further sort will be necessary.
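So after building the intermediate file with the padded size column, the further pass would look something like this sketch (the /+21 column assumes a 19-character date/time field plus the comma; adjust N for your locale, and the output name is illustrative):

Code: [Select]
rem The size field was right-justified to 16 characters, so a plain
rem character sort starting at that column orders the sizes numerically.
rem /O outfile is reportedly quicker than redirecting stdout.
sort /R /+21 Unsorted_Intermediate.csv /O Sorted_Master.csv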

Quote
Forcing the size data into a 16-byte width seems unnecessary, as does using the sort command. Excel will import it correctly.

I asked the OP whether he had Excel but got no reply.

Quote
The new model of this snippet from *censored* has gone from I/O bound to CPU bound. Is this an improvement? [that's a rhetorical question ST]

I'm not sure...

Quote from: Sidewinder on July 17, 2011, 02:32:35 PM
Forcing the size data into a 16-byte width seems unnecessary, as does using the sort command. Excel will import it correctly.

I just found that when importing a CSV, Excel 2003 has a maximum row limit of 65,536 - Excel 2007 allows 1,048,576 rows.

Anyhow, PowerShell is much quicker than batch - I catalogued my whole computer in 2 minutes 35 seconds; using batch and dir took 50 minutes 51 seconds.

320,000 files approx in around 62,000 folders

Found 25 GB of forgotten stuff to delete
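For reference, the PowerShell version of such a catalogue might look roughly like this (a sketch only; the path, output file name, and selected columns are my assumptions, using PowerShell 2.0-era syntax):

Code: [Select]
# Recursively list files, sort by size descending, write a CSV.
# Path and output name are illustrative.
Get-ChildItem -Path C:\ -Recurse -ErrorAction SilentlyContinue |
  Where-Object { -not $_.PSIsContainer } |
  Sort-Object Length -Descending |
  Select-Object LastWriteTime, Length, Name, DirectoryName |
  Export-Csv Final_Master.csv -NoTypeInformation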

