
I have a long script (>500 lines) that I load on each bootstrap of a webpage, from a local file. The script's source code is updated often and published at a certain URL so that all users can update the local files containing it (because it is so heavy, many prefer loading it locally).

Not updating this script frequently exposes my website to security breaches, but instead of copying and pasting the data manually each time there is an update, I want to reload it into my local file automatically with a cron task (say 0 0 1 * 0).

Is there a way, in Unix, to copy all the data from a script accessible at a URL into a local script file?

I guess this will involve something like a here-document (<< EOF ... EOF), but reading not from my own standard input at the Bash prompt, but rather from a remote file.

  • Install curl if you don't have it. Write a wrapper which downloads the script if needed and then execs it.
    – AlexP
    Commented Dec 7, 2016 at 11:51
  • What kind of wrapper do you mean? You are welcome to publish an answer with an example. Thanks,
    – user149572
    Commented Dec 7, 2016 at 14:28

1 Answer


There are at least 2 possible ways to do this:

Way 1 - Use curl in a cron job:

Use curl (install it if it isn't already installed). curl downloads files using any of the usual protocols -- HTTP, HTTPS, FTP. You could use curl in a cron job to download a fresh copy of the script periodically. See man curl for invocation details.

curl -o /path/to/script http://www.example.com/script 

You may want to download the script under a name different from that of the script executed in production, and mv it over the production script once the download succeeds; this avoids any problem in case the script is invoked mid-download.
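A minimal sketch of this download-then-rename step, assuming a hypothetical update-script.sh with the answer's placeholder paths and URL:

```shell
#!/bin/sh
# update-script.sh -- refresh the local copy of the remote script.
# URL and DEST are placeholders; adjust them for your setup.
URL="http://www.example.com/script"
DEST="/path/to/script"
TMP="$DEST.tmp"

# -f makes curl fail on HTTP errors instead of saving an error page;
# -s silences the progress output (useful under cron).
if curl -fs -o "$TMP" "$URL"; then
    # mv within one filesystem is atomic, so a reader never sees
    # a half-written file.
    mv "$TMP" "$DEST"
else
    rm -f "$TMP"    # clean up any partial download
fi
```

A crontab entry such as 0 0 * * 0 /path/to/update-script.sh would then run this refresh every Sunday at midnight.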

Way 2 - Use a Wrapper script:

Write a wrapper which downloads the script if needed, and then execs or calls it (depending on the scripting language). The example is written in Bash so that the logic is easily comprehensible; for other scripting languages you will have to adapt it. For example, suppose the local script is named real-script and the URL is http://www.example.com/real-script. The wrapper could be wrapper-script:

#! /bin/bash

if [ -f /path/to/real-script ] ; then
    # Local copy exists, download only if remote file is newer
    curl -z /path/to/real-script -o /path/to/real-script http://www.example.com/real-script
else
    # Local copy does not exist, always download
    curl -o /path/to/real-script http://www.example.com/real-script
fi

chmod 755 /path/to/real-script
exec /path/to/real-script "$@"

Call this wrapper script as if it were the real script; the wrapper will download the real script if there is no local copy, or if the local copy is older than the remote file; then it will execute the guaranteed-fresh copy, passing along any arguments.

You may also want to avoid looking for a newer version if the existing local copy is not more than 5 minutes old, or whatever time interval you allow.
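One way to implement such a freshness check is with find -mmin, which prints the path only when the file's modification time is older than the given number of minutes. This is a sketch with the answer's placeholder paths, not part of the original answer:

```shell
#!/bin/sh
# Skip the network round-trip entirely if the local copy is less
# than 5 minutes old. SCRIPT is a placeholder path.
SCRIPT="/path/to/real-script"

if [ -f "$SCRIPT" ]; then
    # find prints the path only when the file is more than 5 minutes
    # old (-mmin +5); empty output means the copy is still fresh.
    if [ -z "$(find "$SCRIPT" -mmin +5)" ]; then
        : # fresh enough, do nothing
    else
        curl -z "$SCRIPT" -o "$SCRIPT" http://www.example.com/real-script
    fi
else
    curl -o "$SCRIPT" http://www.example.com/real-script
fi
```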

Note that when using this method in production, some error checking will probably be needed.

  • As I said under point 1, just use curl: curl -o /path/to/script http://www.example.com/script. curl is available for all kinds of Linux, and for Windows. Read the curl manual page; it's not long or complicated.
    – AlexP
    Commented Dec 7, 2016 at 22:38
  • Surely, it's not long or complicated; I just thought your explanation described a three-step solution instead of a two-step one (I will add this as a side note). I suggest deleting the comment to keep the answer minimal.
    – user149572
    Commented Dec 8, 2016 at 2:34
