What is going to be my best way to search for and remove duplicates from over 50 GB worth of text files and merge them into one? I figure a GUI app will just hang, so I need a CLI-style tool with threading support, or a way to use Linux.
Here is a post that may help you, but on a smaller scale you could also write your own script along those lines to handle a huge number of files.
This post may also help; it uses the sort command.
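Since the question mentions Linux as an option, here is a minimal sketch of the sort-based approach, assuming "duplicates" means duplicate lines and all the files sit in one directory (the buffer size, temp dir, and thread count are just examples to tune for your machine):

```shell
# Merge all .txt files and drop duplicate lines in one pass.
# GNU sort spills to temp files on disk, so 50 GB of input is fine:
#   -u          keep only unique lines
#   --parallel  use several cores (GNU coreutils 8.6+)
#   -S          cap the in-memory buffer
#   -T          temp dir on a disk with enough free space
#   -o          write the merged, deduplicated result
sort -u --parallel=4 -S 2G -T /tmp -o merged.txt ./*.txt
```

Note this also sorts the output; if you need to keep the original line order, that takes a different tool (e.g. `awk '!seen[$0]++'`), which holds every unique line in memory.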
Auslogics Duplicate File Finder is free and should have no problem with 50 GB of text files. Make sure you download it directly from the site to avoid the CNET downloader, and don't accept the Auslogics toolbar. It's fairly intuitive and makes it easy to get rid of duplicates.
To concatenate your text files on Windows, you could try this in a cmd window run as admin:
copy *.txt bigfile.txt
which will copy all your text files into one big file. Give the output a name that doesn't match *.txt (or put it in another directory), so it isn't picked up as an input itself. No idea whether this will work with the volume of files you have.
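The Linux equivalent of that concatenation step (without any deduplication) is `cat`; a sketch, again assuming everything is in one directory:

```shell
# Concatenate every .txt file into a single output file.
# Writing to a different extension keeps the output file from
# matching the *.txt glob on a second run.
cat ./*.txt > merged.out
```

You can then feed `merged.out` to whatever dedup step you choose.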