Did you know Linux Journal has several dozen ebooks you can download for free? No login required. https://www.linuxjournal.com/books
#Linux #opensource #FOSS #sysadmin #programming #security #CloudComputing #containers #devops #drupal #Docker #Kubernetes #SQL ...and on and on
@linuxjournal I hope you are OK with this script existing, btw ;)
#!/usr/bin/env bash
# Download every free ebook listed on the Linux Journal books index page.
base_url="https://www.linuxjournal.com/books"
# Grab the link wrapping each book-cover image on the index page.
for i in $(curl -s "$base_url" | grep field--name-field-image | awk -F'href=' '{print $2}' | cut -d'>' -f1 | tr -d '"'); do
  url2="https://www.linuxjournal.com${i}"
  # On each book's page, pull the last link pointing into sites/default/files/201x.
  tmp=$(curl -s "$url2" | grep 'sites/default/files/201' | tail -1 | awk -F'href=' '{print $2}' | cut -d'>' -f1 | tr -d '"')
  wget "https://www.linuxjournal.com/node/${tmp}"
done
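One fragile spot in the script above: splitting on href= with awk breaks if the site ever reorders attributes inside the tag. A sketch of a sturdier extraction (same idea, grep -o with a bracket expression instead of field splitting; the sample HTML line here is made up for illustration):

```shell
# Pull href values out of a tag with grep -o, which tolerates attribute
# reordering; sed then strips the surrounding href="..." wrapper.
html='<div class="field--name-field-image"><a class="cover" href="/content/some-ebook">'
echo "$html" | grep -o 'href="[^"]*"' | sed 's/href="//; s/"//'
# prints /content/some-ebook
```

The same pipeline would drop in for both awk/cut/tr chains in the script.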