There are at least two ways to do this:
Way 1 - Use curl in a cron job:
Use curl (install it if it isn't already installed). curl downloads files using any of the usual protocols -- HTTP, HTTPS, FTP. You can run curl in a cron job to download a fresh copy of the script periodically. See man curl for invocation details.
curl -o /path/to/script http://www.example.com/script
You may want to download the script under a different name from the one executed in production, and mv it over the production script only when the download succeeds; this avoids problems if the script is invoked mid-download.
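As a concrete sketch, a crontab entry along these lines would refresh the copy every 30 minutes. The schedule, paths, and URL are examples to adapt; curl's -f and -s flags make it fail quietly on server errors, so the mv only runs after a successful download:

```shell
# crontab fragment (example: check every 30 minutes)
# download to a temporary name, then atomically replace the production script
*/30 * * * * curl -fsS -o /path/to/script.new http://www.example.com/script && mv /path/to/script.new /path/to/script
```

Because mv within the same filesystem is atomic, a process starting the script mid-update sees either the old copy or the new one, never a half-written file.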
Way 2 - Use a Wrapper script:
Write a wrapper which downloads the script if needed, and then execs or calls it (depending on the scripting language). The example is written in Bash so that the logic is easy to follow; for other scripting languages you will have to adapt it. For example, suppose the local script is named real-script and the URL is http://www.example.com/real-script. The wrapper, wrapper-script, could be:
#! /bin/bash
if [ -f /path/to/real-script ] ; then
  # Local copy exists, download only if remote file is newer
  curl -z /path/to/real-script -o /path/to/real-script http://www.example.com/real-script
else
  # Local copy does not exist, always download
  curl -o /path/to/real-script http://www.example.com/real-script
fi
chmod 755 /path/to/real-script
exec /path/to/real-script "$@"
Call this wrapper script as if it were the real script; the wrapper downloads the real script if there is no local copy, or if the local copy is older than the remote file, and then executes the guaranteed-fresh copy, passing along any arguments.
You may also want to skip the check for a newer version if the existing local copy is less than five minutes old, or whatever time interval you choose.
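One way to sketch that age check is with find's -mmin test, which prints the file name only if its modification time is within the given window. The function name is_fresh and the five-minute threshold are illustrative choices, not part of the original wrapper:

```shell
#!/bin/bash
# Sketch: report whether a file was modified within the last N minutes.
# is_fresh is a hypothetical helper name; -mmin -N matches files whose
# mtime is less than N minutes ago.
is_fresh() {
    [ -n "$(find "$1" -mmin "-$2" 2>/dev/null)" ]
}

# Inside the wrapper, the download could then be guarded like this
# (same curl invocation as the example above):
#   if ! is_fresh /path/to/real-script 5; then
#       curl -z /path/to/real-script -o /path/to/real-script http://www.example.com/real-script
#   fi
```

A missing file is never "fresh", so the guard still triggers the initial download.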
Note that when using this method in production, some error checking will probably be needed.
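As one possible shape for that error checking, the wrapper logic can be factored into a function that keeps the existing copy when the download fails. The function name refresh_and_run and the paths are hypothetical; curl's real -f/-s/-S flags make it exit nonzero quietly on server errors:

```shell
#!/bin/bash
# Sketch: download to a temporary name, replace the old copy only on
# success, warn (but still run the existing copy) on failure.
# refresh_and_run is a hypothetical name for illustration.
refresh_and_run() {
    local url=$1 script=$2
    shift 2
    if curl -fsS -o "$script.new" "$url"; then
        chmod 755 "$script.new"
        mv "$script.new" "$script"   # atomic replace of the old copy
    else
        echo "warning: download failed, running existing copy" >&2
    fi
    exec "$script" "$@"
}

# Example invocation from the wrapper (paths and URL are examples):
#   refresh_and_run http://www.example.com/real-script /path/to/real-script "$@"
```

Other checks worth considering in production include verifying that the downloaded file is non-empty and that exec itself can find an executable script.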