I was recently asked to search several teams' code bases for a list of over 270 different search strings. We are a Windows OS organization, and a search like that would take some time. By the way, has anyone else noticed that the Windows Explorer search tool is flaky? Even with a dedicated search tool it would still have taken a while.
I have cygwin installed, and I know grep works well - when I search for something I know is in a file, I find it (unlike Windows sometimes). What I didn't know was that grep has a -f option that lets you provide a file with all your search terms separated by newlines. An astute coworker made me aware of it, and I tried it out this morning. It worked nicely and saved me some time.
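The search-strings file is just plain text, one term per line. Something like this (these particular terms are made up for illustration):

password
ConnectionString
TODO

Here's how I implemented it: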
grep -r -i -f file_of_search_strings.txt * | grep -v '\.svn' | grep -v Assert | grep -v '\.sql' >> resultFile.txt
-r is a recursive search, starting from wherever I'm currently sitting on the command line
-i is a case-insensitive search
-f allows me to pass in my file of search strings
* tells grep to search every folder/file at this level. Combined with -r, it tells grep to search every single folder and file under the directory I've cd'ed to, the current directory included.
The | grep -v '\.svn' filters out any result lines containing .svn, which keeps matches from Subversion's metadata folders out of the output (there's an alternative below that skips those folders during the search itself)
and then >> appends my results to a file for inspection. (Strictly speaking >> is a redirect, not a pipe, and it appends rather than overwrites; use a single > if you want a fresh file on each run.)
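As an alternative to the | grep -v '\.svn' filter, a reasonably recent GNU grep (which the cygwin build typically is) has an --exclude-dir option that skips the Subversion metadata directories during the search itself. A sketch, keeping my other filters and example file names the same:

grep -r -i --exclude-dir=.svn -f file_of_search_strings.txt * | grep -v Assert | grep -v '\.sql' >> resultFile.txt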
One caveat: you may have to run the dos2unix command on the text file with your list of search terms to get grep to see the newlines correctly. A file saved on Windows will have CRLF line endings, and grep will otherwise treat the trailing carriage return as part of each search pattern.
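That conversion is just (using the file name from my example above):

dos2unix file_of_search_strings.txt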