I just moved my terabyte of data from the last 12 years to a Buffalo NAS, but it's in a directory structure that dates back to when I had 50 CDs for backups, so one of my folders is "CD Archives" and inside it are 50 folders, each named with the date of the CD backup. I've used SuperCat for over 8 years now & catalogued each of those CDs when I first began keeping copies on at least two external hard drives (I still have the CDs too, of course). The rest of the top-level directories on my NAS are the quarterly backups from the last 4 years, since upgrading from CDs.
Each folder of backup data is named with the backup date. Each folder is catalogued by SuperCat, which simply captures file names, dates, file sizes, and the folder each file resides in. This means the catalogue file for a 100 GB backup is less than 1 MB of data...
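For anyone curious what such a catalogue amounts to, here's a minimal sketch in Python (not SuperCat itself, just an illustration of the same idea) that walks a backup folder and writes those four fields to a CSV:

```python
import csv
import os
import time

def build_catalogue(backup_root, catalogue_path):
    """Walk a backup folder and record name, modified date, size,
    and containing folder for every file -- the same four fields
    a SuperCat-style catalogue keeps."""
    with open(catalogue_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["filename", "modified", "size_bytes", "folder"])
        for folder, _dirs, files in os.walk(backup_root):
            for name in files:
                stat = os.stat(os.path.join(folder, name))
                writer.writerow([
                    name,
                    time.strftime("%Y-%m-%d", time.localtime(stat.st_mtime)),
                    stat.st_size,
                    folder,
                ])

# e.g. build_catalogue("/mnt/nas/CD Archives/2004-03-15", "2004-03-15.csv")
```

Because only metadata is stored, never file contents, the catalogue stays tiny no matter how big the backup is.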
...so, as you can imagine, running a search across a NAS holding a terabyte of data is going to take some time. Running a search of the catalogues, on the other hand, finds the keyword in a split second, & I can then see all the hits on that keyword: filenames, dates, sizes, and the full path of folders within folders directing me to that file.
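That kind of keyword search is fast precisely because it only scans a handful of small text files rather than the NAS itself. A rough sketch of the same idea, again an illustration rather than SuperCat's actual code, reading the CSVs from the sketch above:

```python
import csv
import glob

def search_catalogues(pattern, keyword):
    """Scan every catalogue CSV matching the glob pattern and print any
    row whose filename or folder path contains the keyword."""
    keyword = keyword.lower()
    for cat in glob.glob(pattern):
        with open(cat, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                if (keyword in row["filename"].lower()
                        or keyword in row["folder"].lower()):
                    print(f'{row["filename"]}  {row["modified"]}  '
                          f'{row["size_bytes"]} bytes  {row["folder"]}')

# e.g. search_catalogues("catalogues/*.csv", "invoice")
```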