Tag Archive for 'bash'

Data to S3 with command line and encryption

This article details how to archive folders with their contents, encrypt the archives, upload them to S3, and remove the local copies.  A typical use is taking a dated snapshot of a website with twenty thousand files before reorganizing the folder layout for ten thousand of its images.

With about 345 GB of data in S3, I run into specific situations that merit custom scripts. One of these is that I like to create a snapshot of a folder at a specific point in time before adjusting the scripts and software within it. I also do this for documents and other folders so that a point-in-time backup exists; if I later decide I want a previous folder structure and its contents, I can simply unzip the archive. These backups preserve a previous version of a website or web application in case I change my mind later, and they are rarely used. The requirements are: copy a folder and all of its contents recursively, preserving permissions such as the execute bit on scripts; encrypt the resulting zip file with AES-256; upload it to an S3 bucket; then delete the local zip file and encrypted file to free disk space. The second component is a decryption script for when I download such a file and want to extract it.

There are some factors to consider. One is storage cost: base64-encoding the encrypted archives (the -a option in the scripts below) adds roughly a third to the file size. S3 offers an automatic transition to a lower-cost storage tier, so I enable that for the longer-term storage; the Standard-Infrequent Access (Standard-IA) tier is just over a penny per GB per month.  The second major consideration is security.  Amazon offers encryption of S3 buckets and their contents using keys managed by Amazon, and I turn this on.  My concern is hackers grabbing the data using stolen keys, not evading the NSA or other government agencies; if anyone will have quantum computers, it will be the NSA, so if they want the data, they will get it.  With this in mind, I chose symmetric encryption using OpenSSL.  GnuPG is a very popular and recommended application for this, and there are arguments against using OpenSSL for the purpose, but ultimately it came down to the objective: AES encryption with a password that can be passed in scripts, rather than managing public and private keys.  It is a risk that Amazon's encryption keys could be compromised, allowing others to read the data. Amazon likely has intrusion detection systems that would catch hackers attempting to brute-force the passwords on files within an S3 bucket, though attackers could also download all the files and brute-force them locally.  I plan more details on this choice in a later blog post.
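The lifecycle transition can be set from the command line as well as the console. A minimal sketch using the AWS CLI, assuming the bucket and prefix names used later in this post; the rule ID and file name are illustrative:

# lifecycle.json: move objects under long-term-archives/ to Standard-IA after 30 days
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archives-to-standard-ia",
      "Filter": { "Prefix": "long-term-archives/" },
      "Status": "Enabled",
      "Transitions": [ { "Days": 30, "StorageClass": "STANDARD_IA" } ]
    }
  ]
}
EOF

aws s3api put-bucket-lifecycle-configuration \
    --bucket bucket-name --lifecycle-configuration file://lifecycle.json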

The choices are the Zip file format, so the archives can be viewed on multiple platforms with ease; *nix file permission preservation on Linux; and AES encryption prior to upload, with decryption after download. To accomplish this, I created a folder in the home directory called Archives, where the zip files are created.  The next step is the creation of two scripts, which I called myencrypt and mydecrypt and store in a subdirectory of the home folder called Scripts.  The final piece is two functions in the .bashrc file so that the whole process is available at the command line.
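The one-time setup is small; a sketch, assuming the paths used throughout this post:

mkdir -p ~/Archives ~/Scripts
# after saving the two scripts below into ~/Scripts:
chmod +x ~/Scripts/myencrypt ~/Scripts/mydecrypt
# after adding the two functions below to ~/.bashrc:
source ~/.bashrc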

myencrypt

#!/usr/bin/env bash

# $1 = input file; output is written to $1.encrypted
openssl enc -aes-256-cbc -salt -a -p -in "$1" -out "$1.encrypted" -k "some-cool-password"

mydecrypt

#!/usr/bin/env bash

# $1 = input file
# $2 = output file
openssl enc -aes-256-cbc -a -d -p -in "$1" -out "$2" -k "some-cool-password"
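Hardcoding the password works, but it is readable by anything that can read the scripts. If that is a concern, OpenSSL can read the password from a file via its -pass option instead of -k; a sketch, where ~/.archive-pass is an assumed owner-only file containing just the password:

chmod 600 ~/.archive-pass
openssl enc -aes-256-cbc -salt -a -p -in "$1" -out "$1.encrypted" -pass file:"$HOME/.archive-pass"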

bashrc, ltsarchive()

ltsarchive() { timestamp=$(date +"%Y-%m-%d-%H%M%p") \
    && read -p "Enter folder name: " name \
    && zip -rv9 ~/Archives/"$name-$timestamp".zip "$name" \
    && ~/Scripts/myencrypt ~/Archives/"$name-$timestamp".zip \
    && aws s3 cp ~/Archives/"$name-$timestamp".zip.encrypted \
           s3://bucket-name/long-term-archives/"$name-$timestamp".zip.encrypted \
    && rm -f ~/Archives/"$name-$timestamp".zip \
    && rm -f ~/Archives/"$name-$timestamp".zip.encrypted; }

bashrc, ltsdecrypt()

ltsdecrypt() { filepath=$(pwd) \
    && read -p "Enter file name: " name \
    && newname=${name::-10} \
    && ~/Scripts/mydecrypt "$filepath/$name" "$filepath/$newname"; }

For example, suppose I want to archive the Templates directory and all of its contents.  From its parent directory I type ltsarchive, and the shell prompts for the name of the folder.  I type Templates, and the function first zips the Templates folder and its contents into ~/Archives/Templates-2018-07-23-2037PM.zip.  That file is then passed to OpenSSL, which encrypts it to Templates-2018-07-23-2037PM.zip.encrypted.  That file uploads to the long-term storage folder in the specified S3 bucket, after which the zip file and the .encrypted file are deleted from ~/Archives/.  Upon checking the S3 bucket, Templates-2018-07-23-2037PM.zip.encrypted appears.
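At the shell, the exchange looks roughly like this (the parent directory is illustrative, and the zip and upload output is trimmed):

$ cd ~/websites
$ ltsarchive
Enter folder name: Templates
  adding: Templates/ (stored 0%)
  ...
upload: Archives/Templates-2018-07-23-2037PM.zip.encrypted to s3://bucket-name/long-term-archives/Templates-2018-07-23-2037PM.zip.encrypted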

Then, the files can be downloaded via the web browser or through the command-line interface at a later time.  To decrypt Templates-2018-07-23-2037PM.zip.encrypted, we go into its directory at the shell and type ltsdecrypt.  It asks for the file name, so we type Templates-2018-07-23-2037PM.zip.encrypted.  It decrypts the file and leaves Templates-2018-07-23-2037PM.zip in the same folder as the downloaded file.  These scripts can then be used in automation.
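A command-line round trip might look like the following; the download directory is illustrative:

$ cd ~/Downloads
$ aws s3 cp s3://bucket-name/long-term-archives/Templates-2018-07-23-2037PM.zip.encrypted .
$ ltsdecrypt
Enter file name: Templates-2018-07-23-2037PM.zip.encrypted
$ unzip Templates-2018-07-23-2037PM.zip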


Yum update memory allocation issues

On virtual machines with 512 MB of RAM, yum update can fail with memory allocation errors. A temporary swap file works around this.

Create a Swap file:

dd if=/dev/zero of=/swapfile bs=1024 count=655360   # roughly 640 MB of swap
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
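
To confirm the swap space is active before running yum update:

swapon -s   # list active swap files and devices
free -m     # memory and swap totals in MB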

To disable and remove after updates are complete:
swapoff -a
rm -f /swapfile

High Quality compressed HPLIP scans

HPLIP produces scans far better than what I can accomplish on Windows, even with Adobe Acrobat Standard.  The automatic document feeder (ADF) scans to 8.3-megapixel PNG files, and HPLIP converts them to a multi-page PDF of extremely high quality.  This is great for one-page documents, but for 10- or 12-page documents, such as faxes, the file size becomes a problem due to email limitations.  Ghostscript can compress these, but that adds several steps unless the process is simplified.  To that end, here is a shell script that scans from the ADF using HPLIP and compresses the PDF output to the prepress/best setting, as mentioned in the article on Compressing PDFs in Bash.

This assumes that there is a ~/Temp directory and a ~/Scans directory for scans to go into.  Additionally, the username must be replaced in the script in a few places; the cp command refused to expand the ${USERNAME} variable, so it has to be hard-coded.

#!/usr/bin/env bash
YMD=$(date +"%Y-%m-%d")
#  Create a folder, ~/Temp and create a folder, ~/Scans
FILEOUTPUT="/home/${USERNAME}/Scans/$1-scanned-${YMD}.pdf"
hp-scan --adf --mode=color &&
PART=hpscan
LATESTSCAN=$(ls -t ${PART}*pdf | sed "1q")
echo "${LATESTSCAN} ready for ${FILEOUTPUT}"
#  Why you shouldn't parse the output of ls
#  http://mywiki.wooledge.org/ParsingLs 
#  This does not use the FILEOUTPUT variable itself because the cp command
#  seems to have trouble copying the file correctly, even with switches. 
#  Change 'username' (without braces) to the actual username.
cp "${LATESTSCAN}" "/home/username/Temp/$1-uncompressed-scanned-${YMD}.pdf"
gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
-dPDFSETTINGS=/prepress \
-sOutputFile="/home/username/Scans/$1-scanned-${YMD}.pdf" "${LATESTSCAN}"
sleep 1
rm -f "${LATESTSCAN}"
rm -f "/home/username/Temp/$1-uncompressed-scanned-${YMD}.pdf"
killall evince   # close any viewer windows left open from previewing the scan
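
Assuming the script is saved as ~/Scripts/adfscan (the name is illustrative) and marked executable, a run that labels the output might look like:

$ cd ~/Temp
$ ~/Scripts/adfscan invoice
# produces /home/username/Scans/invoice-scanned-2018-07-23.pdf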

Compressing PDFs in Bash

I have scripts for HPLIP to scan PDFs to folders using the ADF. These PDFs are very large. To convert them to smaller file sizes for emailing, I added the following functions to .bashrc.

mediumpdf() { read -p "Enter input PDF filename: " name \
&& gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
-dPDFSETTINGS=/printer -sOutputFile="medium-quality-$name" "$name"; }
smallpdf() { read -p "Enter input PDF filename: " name \
&& gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
-dPDFSETTINGS=/ebook -sOutputFile="low-quality-$name" "$name"; }
tinypdf() { read -p "Enter input PDF filename: " name \
&& gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
-dPDFSETTINGS=/screen -sOutputFile="lowest-quality-$name" "$name"; }
bestpdf() { read -p "Enter input PDF filename: " name \
&& gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
-dPDFSETTINGS=/prepress -sOutputFile="best-quality-$name" "$name"; }

Using a 32.9 MB test PDF straight from HPLIP, bestpdf created a 10.5 MB file with very good quality, almost indistinguishable from the original.  mediumpdf created a 7.8 MB file that looked good enough for faxing, with some color distortion.  smallpdf created a 1.5 MB file that might be good enough for faxing.  tinypdf created a 482 kB file with clear color distortions.
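For use in other scripts, the same Ghostscript call can take its arguments on the command line instead of prompting; a minimal sketch, where the function name and the quality argument are conveniences of this sketch:

# usage: compresspdf /ebook input.pdf
compresspdf() {
    local quality="$1" name="$2"
    gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
       -dPDFSETTINGS="$quality" -sOutputFile="compressed-$name" "$name"
}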

Zip files and UUID from Bash

Two quick items for .bashrc.  The archive function prompts for a folder name and zips the folder and its contents with a timestamp.  The uuid function produces a unique identifier easily.

archive() { read -p "Enter folder name: " name \
            && zip -rv9 "$name-$(date +"%Y-%m-%d-%H%M%p").zip" "$name"; }
uuid() { UUID=$(cat /proc/sys/kernel/random/uuid) && echo "$UUID"; }
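
Sample usage, with illustrative output:

$ archive
Enter folder name: Templates
  adding: Templates/ (stored 0%)
$ uuid
0f8fad5b-d9cb-469f-a165-70867728950e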

Timezone and .bashrc

Changing Linux instances from UTC to a local time zone. I found the easiest way thanks to ajtrichards.

sudo rm /etc/localtime
sudo ln -s /usr/share/zoneinfo/America/Chicago /etc/localtime
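
On systemd-based distributions, timedatectl accomplishes the same thing in one step, and date confirms the result:

sudo timedatectl set-timezone America/Chicago
date   # should now report the local zone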

For the local machine, to have a calculator at the command line, the Zenburn color scheme, and a title for the terminal window, this works as a good .bashrc.

# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi
# Uncomment the following line if you don't like systemctl's auto-paging feature:
# export SYSTEMD_PAGER=
# User specific aliases and functions
calc () { echo "$*" | bc -l; }
PROMPT_COMMAND='echo -ne "\033]0;Fedora 25: ${PWD}\007"'
# Zenburn color scheme
# Thanks to https://github.com/sonatard/color-theme-zenburn
echo -ne '\e]12;#BFBFBF\a'   # cursor color
echo -ne '\e]10;#DCDCCC\a'   # text foreground
echo -ne '\e]11;#3F3F3F\a'   # background
echo -ne '\e]4;0;#3F3F3F\a'  # palette entries 0-13
echo -ne '\e]4;1;#705050\a'
echo -ne '\e]4;2;#60B48A\a'
echo -ne '\e]4;3;#DFAF8F\a'
echo -ne '\e]4;4;#506070\a'
echo -ne '\e]4;5;#DC8CC3\a'
echo -ne '\e]4;6;#8CD0D3\a'
echo -ne '\e]4;7;#DCDCCC\a'
echo -ne '\e]4;8;#709080\a'
echo -ne '\e]4;9;#DCA3A3\a'
echo -ne '\e]4;10;#C3BF9F\a'
echo -ne '\e]4;11;#F0DFAF\a'
echo -ne '\e]4;12;#94BFF3\a'
echo -ne '\e]4;13;#EC93D3\a'