[SOLVED] Download files from a website in a specified range using a bash script.
by TashiDuks from LinuxQuestions.org on (#545RB)
Download files from a website in a specified range using a bash script.
For example, a website has files named in the following manner:
http://localhost/log/log01.log
http://localhost/log/log02.log
... log100.log
The bash script should download only the range specified by the user. Following is my script, which does not download anything.
Code:
#!/bin/bash
echo "Enter the range of image you want to download"
read -p "Enter first range of image. log: " StartRange
read -p "Enter Last range of image. log: " EndRange
for i in {$StartRange..$EndRange}; do
wget -q -P DownloadFolder/ http://localhost/log/log$i.log
done
I also want to output my download progress as follows for each file:
Downloading [filename], with the file name [filename and extension], with a file size of [file size and metric]
Please help me to fix the above code.
P.S. This question requires the bash script to use only "wget" or "curl" for the downloads.
Thanks
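
For reference, here is a minimal sketch of one way this could be written. The original loop never downloads because {$StartRange..$EndRange} is brace-expanded before the variables are substituted, so the loop body runs once with a literal string; a C-style for loop avoids that. The sketch assumes the files really are zero-padded as log01.log .. log99.log (then log100.log), keeps the DownloadFolder/ and localhost URL from the question, and uses du only to report a human-readable size after wget has fetched the file:

Code:
#!/bin/bash
# Sketch: download log files in a user-specified range and report each one.

read -p "Enter first range of image. log: " StartRange
read -p "Enter last range of image. log: " EndRange

mkdir -p DownloadFolder

for (( i = StartRange; i <= EndRange; i++ )); do
    # Zero-pad to two digits so 1 becomes 01, matching log01.log .. log99.log
    # (100 and above keep all their digits).
    num=$(printf '%02d' "$i")
    file="log${num}.log"

    wget -q -P DownloadFolder/ "http://localhost/log/${file}"

    if [[ -f "DownloadFolder/${file}" ]]; then
        # du -h prints a human-readable size (e.g. 12K) followed by the path.
        size=$(du -h "DownloadFolder/${file}" | cut -f1)
        echo "Downloading ${file%.*}, with the file name ${file}, with a file size of ${size}"
    else
        echo "Failed to download ${file}" >&2
    fi
done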

