This seems to work fine. Here's a Python script that should do the job, but I'd be curious to know if Python will still work after running it.
import os

pyc_files = []
py_files = set()
for root, dirnames, filenames in os.walk('.'):
    for filename in filenames:
        path = os.path.join(root, filename)
        if filename.endswith('.pyc'):
            pyc_files.append(path)
        elif filename.endswith('.py'):
            py_files.add(path)

# a .pyc whose matching .py source is gone is stale -- remove it
for pyc_file in pyc_files:
    if pyc_file[:-1] not in py_files:
        os.remove(pyc_file)
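The same idea can be written with pathlib, wrapped in a function so the scan and the deletion are separated; the function name here is mine, not from the original script:

```python
from pathlib import Path

def find_orphaned_pyc(root):
    """Return paths of .pyc files under root whose matching .py is missing."""
    return [str(p) for p in Path(root).rglob('*.pyc')
            if not p.with_suffix('.py').exists()]

# Deleting is then a separate, deliberate step:
# for path in find_orphaned_pyc('.'):
#     Path(path).unlink()
```

Keeping the destructive step out of the scan makes it easy to print the list first and eyeball it before removing anything.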
cp -r * except don't copy any .pdf files - copy a directory subtree while excluding files with a given extension
Bash itself can't help here, unfortunately. Many people use either tar or rsync for this type of task, because each of them can copy files recursively and each provides an --exclude option for skipping certain filename patterns. tar is more likely to be installed on a given machine, so I'll show you that.
tar -cC /var/www . | tar -x
tar -cC /var/www --exclude '*.pdf' . | tar -x
tar -cC /var/www --exclude '*.pdf' --exclude '*.txt' . | tar -x
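For a quick sanity check, the pipeline can be exercised on a throwaway tree (the paths below are made up for the demo):

```shell
# build a scratch source tree
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/docs"
echo '<html>' > "$src/docs/page.html"
echo '%PDF'   > "$src/docs/manual.pdf"

# copy the subtree into $dst, skipping PDFs
(cd "$dst" && tar -cC "$src" --exclude '*.pdf' . | tar -x)

ls "$dst/docs"   # page.html only; manual.pdf was excluded
```

If rsync is available, the equivalent one-liner would be `rsync -a --exclude='*.pdf' /var/www/ dest/` (note the trailing slash on the source, which copies the directory's contents rather than the directory itself).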
Gulpfile task to copy files should optimize images if there are any
Turns out the problem was gulp-if resolving the action before the condition. The condition needs to be evaluated outside the vinyl stream. So I first ran the glob synchronously to get the file list, then determined whether it contained any supported images and saved that to a boolean. When calling gulp-if I conditioned on this boolean, and nested another gulp-if inside it to filter the image files within the stream.
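A minimal sketch of that arrangement. The module names (gulp, gulp-if, gulp-imagemin, glob) and the globs are assumptions from context, so the stream wiring is left as comments and the runnable piece is just the up-front condition:

```javascript
// The key move: compute the condition BEFORE the stream starts.
const IMAGE_EXTS = /\.(png|jpe?g|gif|svg)$/i;
const hasSupportedImages = (paths) => paths.some((p) => IMAGE_EXTS.test(p));

// In the gulpfile itself (gulp, gulp-if, gulp-imagemin, glob assumed installed):
// const files = glob.sync('src/**/*');
// const optimize = hasSupportedImages(files);            // resolved up front
// gulp.src('src/**/*')
//   .pipe(gulpIf(optimize,                               // outer: any images at all?
//     gulpIf((f) => IMAGE_EXTS.test(f.path), imagemin()) // inner: only image files
//   ))
//   .pipe(gulp.dest('dist'));

console.log(hasSupportedImages(['img/logo.png', 'app.js'])); // true
console.log(hasSupportedImages(['app.js', 'style.css']));    // false
```

The nested gulp-if mirrors the two separate questions: "should this build optimize at all?" (answered once, outside the stream) and "is this particular file an image?" (answered per file, inside the stream).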