Some Linux Live USB distributions don't have `curl` installed
I've run into this a few times now (thanks, Ubuntu...). Since the script doesn't use any exotic capabilities of curl, I was able to replace `curl -s -L -O $url` with `wget $url`, and `curl -s -L -o filename.bin $url` with `wget -O filename.bin $url`, and everything functioned perfectly.
Perhaps we could create a `fetch_file` function in functions.sh that checks for the existence of curl or wget and sets `FETCH` to the appropriate binary. If two arguments are passed to the function (output filename and URL), it passes the right option to whichever binary is being used to fetch.
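A minimal sketch of just the detection half, assuming only that one of the two binaries is on PATH (`command -v` is the POSIX way to test for one; the `FETCH` name follows the proposal above):

```shell
#!/bin/sh
# Pick the download tool: prefer curl, fall back to wget
FETCH=
if command -v curl >/dev/null 2>&1; then
    FETCH=curl
elif command -v wget >/dev/null 2>&1; then
    FETCH=wget
else
    echo "ERROR: need curl or wget" >&2
    exit 1
fi
echo "using: $FETCH"
```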
Google has a decent example of this in their recovery.sh for creating recovery media from Linux.
https://chromium.googlesource.com/chromiumos/user-recovery-tools/+/master/linux/recovery.sh#130
https://chromium.googlesource.com/chromiumos/user-recovery-tools/+/master/linux/recovery.sh#176
Since we are dealing with tiny files rather than large recovery images, we probably don't need the extra code they added for resuming downloads and can get by with a single function:
fetch_file () {
    FETCH=
    FETCH_FILENAME=
    URL=
    if [ $# -gt 2 ]; then
        echo "Too many arguments, expecting 'fetch_file \$filename \$url' or 'fetch_file \$url'"
        exit 1
    elif [ $# -eq 2 ]; then
        FETCH_FILENAME=$1
        URL=$2
    else
        echo "Using server filename"
        URL=$1
    fi
    # Written as separate if blocks rather than one elif chain; the
    # [ -z "$FETCH" ] guard makes curl the preferred tool and wget the fallback
    if [ -z "$FETCH" ] && type curl >/dev/null 2>&1; then
        if [ -n "$FETCH_FILENAME" ]; then
            FETCH="curl -s -L -o"
        else
            FETCH="curl -s -L -O"
        fi
    fi
    if [ -z "$FETCH" ] && type wget >/dev/null 2>&1; then
        if [ -n "$FETCH_FILENAME" ]; then
            FETCH="wget -q -O"
        else
            FETCH="wget -q"
        fi
    fi
    if [ -z "$FETCH" ]; then
        echo "ERROR: need \"curl\" or \"wget\""
        exit 1
    fi
    # $FETCH_FILENAME may be empty; it is left unquoted so it drops out of the command
    $FETCH $FETCH_FILENAME "$URL"
}
# Examples of usage
fetch_file overridden_name.bin https://mrcbx.tech/versioned_file.bin
fetch_file https://mrcbx.tech/static_file.bin
Supporting both curl and wget seems silly, since you need one or the other to download the script in the first place. I'm not sure why having to run `sudo apt-get update && sudo apt install curl` beforehand is problematic.
Some users may download firmware-util.sh by visiting the URL and then copying/running it from $HOME/Downloads if they get an error when trying to use curl. It's definitely not a required feature, but sometimes when running from a Live CD there isn't enough room in the temporary partitions to do the apt update and apt install.
Well, IIRC, the default behavior of wget is not to overwrite an existing file, so one would need to use the `-O` option to ensure the same behavior as `curl -LO`; that adds a little complexity. I don't know that I'd spend any time implementing it myself, but I'd certainly take a PR.
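For what it's worth, matching the `curl -LO` overwrite behavior would mostly mean deriving the output name from the URL and passing it to `wget -O` explicitly (the URL below is a placeholder):

```shell
#!/bin/sh
# Sketch: derive the output filename from the URL so that `wget -O $out`
# overwrites an existing file the way `curl -LO` does, instead of wget's
# default of saving a second copy as firmware.bin.1
url="https://example.com/path/firmware.bin"
out="${url##*/}"   # strip everything through the last slash -> firmware.bin
echo "wget -q -O $out $url"
```

Note this simple parameter expansion ignores query strings or trailing slashes, which is probably fine for the static filenames this script fetches.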
Also, with the current implementation one has to install curl with apt and not snap (which is not obvious):
#184