I have this script that goes through a list of URLs and checks the return codes using curl.
Links file goes like this:
https://link1/...
https://link2/...
https://link200/...
(...)
The script:
INDEX=0
DIR="$(grep [WorkingDir file] | cut -d \" -f 2)"
WORKDIR="${DIR}/base"
ARQLINK="navbase.txt"

for URL in $(cat $WORKDIR/$ARQLINK); do
    INDEX=$((INDEX + 1))
    HTTP_CODE=$(curl -m 5 -k -o /dev/null --silent --head --write-out '%{http_code}\n' $URL)
    if [ $HTTP_CODE -eq 200 ]; then
        printf "\n%.3d => OK! - $URL" $INDEX
    else
        printf "\n\n%.3d => FAIL! - $URL\n" $INDEX
    fi
done
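For context, the output looks roughly like this (the failing URL here is just an example):

001 => OK! - https://link1/...
002 => OK! - https://link2/...

003 => FAIL! - https://link3/...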
It takes a while to run through every URL, so I was wondering how to speed up those curl requests. Maybe I could run some curl requests in parallel, but using "xargs" inside a "for" loop while also printing a message doesn't seem the way to go.
I was able to use "xargs" outside the script and it sort of works, although it doesn't show the correct HTTP code.
cat navbase.txt | xargs -I % -P 10 curl -m 5 -k -o /dev/null --silent --head --write-out '%{http_code}\n' %
I couldn't find a way to insert that into the script.
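To make it concrete, this is roughly the direction I was attempting (check_url is just a name I made up; the part I can't figure out is keeping the INDEX counter, since each parallel invocation runs in its own shell):

check_url() {
    URL="$1"
    # one URL per process; curl prints only the status code
    HTTP_CODE=$(curl -m 5 -k -o /dev/null --silent --head --write-out '%{http_code}' "$URL")
    if [ "$HTTP_CODE" -eq 200 ]; then
        printf "OK!   - %s (%s)\n" "$URL" "$HTTP_CODE"
    else
        printf "FAIL! - %s (%s)\n" "$URL" "$HTTP_CODE"
    fi
}
# export the function so the bash started by xargs can see it
export -f check_url

# -P 10: up to 10 checks at once; each line of the file becomes one argument ($1)
xargs -P 10 -n 1 bash -c 'check_url "$1"' _ < "$WORKDIR/$ARQLINK"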
Any tips?