Work within bandwidth limit
by b1bb2 from LinuxQuestions.org (#5SR34)
I have Linux shared hosting with cPanel. I want to copy many large files within my hosting account. I tried to do this with simple copy commands in a bash script. It did the copy, but it seems to have exceeded my allocated bandwidth limit: I/O Usage turned red.

I was later surprised to find that the bash script kept running even after I logged out. This would not be normal behaviour on a PC, but is it normal on a remote server?

I have more copying to do and would like a better method. I could copy each separate file manually, but that is tedious. I could try cron, but the documentation says I should use at, and I would need to set up a separate job for each file. I suppose a good way is to put at statements into a bash script (a rough sketch of what I mean is further down). But I cannot get at working. I tried:
Code:
at -f file.sh now
You are not authorized to run this command

Code:
/bin/sh/at -f file.sh now
bash: /bin/sh/at: Not a directory

Code:
man batch
bash: man: command not found
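To be clear about what I am aiming for: one job per file, spaced apart so the copies do not all run at once and push I/O Usage into the red. It would look roughly like this (an untested sketch; the directories, the file glob, and the 10-minute spacing are placeholders):

Code:
#!/bin/bash
# Queue one at job per large file, spaced apart so the copies
# do not all hit the disk at the same time.
SRC="$HOME/source"        # placeholder: directory holding the large files
DST="$HOME/destination"   # placeholder: where the copies should go

delay=1
for f in "$SRC"/*; do
    echo "cp \"$f\" \"$DST/\"" | at now + "$delay" minutes
    delay=$((delay + 10))   # 10-minute gap between copies
done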
I tried experimenting on my offline PC. at is a package to install. I read that I need to run
Code:
sudo systemctl enable --now adt
but it does not work. Do I need to set up LAMP?
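For reference, my understanding of the steps on a Debian/Ubuntu-style PC is roughly the following (the package name at and the service name atd are assumptions on my part, and may be where I am going wrong):

Code:
# install the at package (Debian/Ubuntu package name assumed)
sudo apt install at

# enable and start the scheduler daemon (service name assumed to be atd)
sudo systemctl enable --now atd

# check that the daemon is actually running
systemctl status atd

# quick test: queue a job for one minute from now
echo "touch /tmp/at-test" | at now + 1 minute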