Archiving data to S3 from the command line, with encryption

This article details how to archive folders with their contents, encrypt the archives, upload them to S3, and remove the local copies.  A typical use case: before reorganizing the image folder layout of a website with twenty thousand files (ten thousand of them images), archive a dated snapshot of the whole site.
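A minimal sketch of that workflow, assuming GnuPG symmetric encryption and the AWS CLI; the function name, bucket name, and passphrase file below are my placeholders, not from the article:

```shell
# Sketch: tar a folder, encrypt it with a symmetric GPG passphrase,
# copy it to S3 with the AWS CLI, then delete the local copies.
# "my-archive-bucket" and ~/.archive-passphrase are placeholder names.
archive_to_s3() {
    local dir="${1%/}" bucket="${2:-my-archive-bucket}"
    local archive="${dir}-$(date +%Y-%m-%d).tar.gz"
    tar -czf "$archive" "$dir" \
      && gpg --batch --yes --pinentry-mode loopback --symmetric \
             --cipher-algo AES256 \
             --passphrase-file ~/.archive-passphrase "$archive" \
      && aws s3 cp "${archive}.gpg" "s3://${bucket}/" \
      && rm -rf "$archive" "${archive}.gpg" "$dir"
}
```

Chaining the steps with && means the local folder is only deleted if the archive, encryption, and upload all succeeded.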

High-quality compressed HPLIP scans

HPLIP produces scans far better than what I can accomplish on Windows, even with Adobe Acrobat Standard.  The automatic document feeder scans to 8.3-megapixel PNG files, and HPLIP then converts them to a multi-page PDF of extremely high quality.  This is great for one-page documents, but for 10- or 12-page documents, such…

Compressing PDFs in Bash

I have scripts for HPLIP to scan PDFs to folders using the ADF. These PDFs are very large. To produce smaller files for emailing, I added the following function to .bashrc:

    mediumpdf() {
        read -p "Enter input PDF filename: " name \
          && gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
                -dPDFSETTINGS=/printer -sOutputFile="medium-quality-$name" "$name"
    }

…
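Ghostscript's /printer preset targets roughly 300 dpi; it also ships /ebook (~150 dpi) and /screen (~72 dpi) for stronger compression. A non-interactive variant along the same lines, taking the filename as an argument (the smallpdf name is my placeholder, not from the post):

```shell
# Sketch of a companion function: same Ghostscript invocation, but with
# the /ebook preset and the input file passed as an argument instead of
# read interactively. "smallpdf" is a hypothetical name.
smallpdf() {
    local name="$1"
    gs -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 \
       -dPDFSETTINGS=/ebook -sOutputFile="small-quality-$name" "$name"
}
```

Usage: `smallpdf scan.pdf` writes small-quality-scan.pdf alongside the original.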

Zip files and UUID from Bash

Two quick items for .bashrc.  The archive command takes a folder name and zips its contents.  The uuid command easily produces a unique identifier.

    archive() {
        read -p "Enter folder name: " name \
          && zip -rv9 "$name-$(date +"%Y-%m-%d-%H%M%p").zip" "$name"
    }

    uuid() {
        UUID=$(cat /proc/sys/kernel/random/uuid) && echo "$UUID"
    }
