
Solved: Batch file: Deleting duplicate files within similar folder structures?

Answer:

Create a batch file that deletes only duplicate files (same filename and date/time) in the same relative folder location. The new folders/files, along with the .bat file, are placed here: \TEST\NEW\. Old files are in \TEST\OLD\.
Folder \OLD\ contains many folders that have the same structure as the folders under \NEW\. The intent is to delete all of the duplicate files under \OLD\ that have the same filename, date/time, and relative location in the folder structure.
For example, \TEST\OLD\1\15\1a.txt or \TEST\OLD\3\15\1a.txt would be deleted if \TEST\NEW\1\15\1a.txt existed and had the same date/time.

I started with the following, which does not seem to work:

Code:

for %%F in ("d:\test2\new\**") do (
    if exist "d:\test2\old\%%~nxF" del "d:\test2\old\%%~nxF"
)

Your script only deletes files in D:\OLD\ (just that folder) if a file with the same name exists in D:\NEW\ (just that folder). However, you seem to want to go deeper into the folders, and there is nothing about file dates in your script.
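
For reference, a plain "for" loop does not recurse (cmd has no recursive "**" wildcard), while "for /R" does walk subfolders, and the "~t" modifier exposes a file's modified date/time. A tiny illustration, using the paths from the post above:

Code:

for /R "d:\test2\new" %%F in (*) do echo %%~tF  %%F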

This is what I've been able to research/apply. As before, the following deletes all occurrences of 1a.txt within the \Old\ path structure regardless of file modified date, size, or path. Even if a file of the same name/date/size exists within the \Old\ structure, only those which also match the path from the fourth level and below are supposed to be deleted.

Code:

SETLOCAL enableextensions
pushd "D:\test2\old\"
for /F "delims=" %%G in ('dir /B /S /A:-D *.*') do (
  call :FileComp "%%~fG" "D:\test2\new\%%~nxG"
)
popd
ENDLOCAL
goto :eof

:UpError
exit /B %1

:FileComp
  call :UpError 321
  fc /B "%~1" "%~2" >NUL 2>&1
  if not %errorlevel% EQU 0 (
    del "%~1"
  ) else (
    echo %errorlevel% "%~2"
  )
goto :eof
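
Two things stand out in that attempt: "D:\test2\new\%%~nxG" keeps only the file's name and extension, so every old file is compared against a file in the root of \new\ rather than in the matching subfolder; and fc sets a nonzero errorlevel both when the files differ and when the second file is missing, so the "if not %errorlevel% EQU 0" branch deletes every old file that has no identical counterpart directly in \new\'s root. What follows is a minimal sketch of the relative-path plus date/time idea, assuming the D:\test2\old and D:\test2\new layout from the posts above; the :CheckDup label and variable names are made up for illustration, "~t" timestamps only resolve to the minute, and it matches the full relative path rather than just the fourth level down:

Code:

@echo off
SETLOCAL enableextensions

set "OLDROOT=D:\test2\old"
set "NEWROOT=D:\test2\new"

rem Walk every file under OLDROOT, including subfolders.
pushd "%OLDROOT%"
for /F "delims=" %%G in ('dir /B /S /A:-D *.*') do call :CheckDup "%%~fG"
popd
ENDLOCAL
goto :eof

:CheckDup
  rem Strip "OLDROOT\" from the front of the full path to get the
  rem path relative to the old tree.
  set "REL=%~1"
  call set "REL=%%REL:%OLDROOT%\=%%"
  rem Only consider deleting when the same relative path exists under NEWROOT.
  if not exist "%NEWROOT%\%REL%" goto :eof
  rem ~t expands to the file's modified date/time; delete the old copy
  rem only when both timestamps match.
  for %%F in ("%NEWROOT%\%REL%") do set "NEWTIME=%%~tF"
  for %%F in ("%~1") do set "OLDTIME=%%~tF"
  if "%OLDTIME%"=="%NEWTIME%" del "%~1"
goto :eof

Putting "echo" in front of the "del" on a first run is a cheap way to preview what would actually be removed.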
Hi, I don't know if this can be helpful to you in your situation ==> Compare files from two folders using HASH SHA1 in BATCH (a rough sketch of the idea follows below).

I'd use long filenames so that every copy would have a unique name, put them all in one folder, and look at them in size order, manually deleting those that are duplicates.
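
A minimal sketch of that SHA1 idea using certutil, which is built into Windows. The two file paths are made-up examples; certutil prints the digest between a header line and a footer line that both contain a colon, so filtering with findstr /V ":" isolates the hash:

Code:

@echo off
SETLOCAL enableextensions

rem Hypothetical pair of files to compare; adjust the paths as needed.
set "FILE1=D:\test2\old\1\15\1a.txt"
set "FILE2=D:\test2\new\1\15\1a.txt"

rem certutil -hashfile prints header, digest, footer; only the digest
rem line is free of colons, so findstr /V ":" keeps just the hash.
set "HASH1="
set "HASH2="
for /F "delims=" %%H in ('certutil -hashfile "%FILE1%" SHA1 ^| findstr /V ":"') do set "HASH1=%%H"
for /F "delims=" %%H in ('certutil -hashfile "%FILE2%" SHA1 ^| findstr /V ":"') do set "HASH2=%%H"

if defined HASH1 if /I "%HASH1%"=="%HASH2%" echo Same content: "%FILE1%" and "%FILE2%"
ENDLOCAL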


