a collection of little things


I have a terrible method for collecting all of the little tips and tricks I use at work.

The method is this: I’ll mention that I did a cool thing on my work log.

That's it. No standard tips-and-tricks wiki page, no GitHub gists, nothing.

So, here they are:

FreeBSD PKG Query

When we upgrade systems from FreeBSD 9 to 10, part of that process is ensuring all packages have been upgraded. This works right off the bat most of the time, but occasionally I have a custom-built port that isn't in the available repos.

So, I used `pkg query` to find packages still built against the old ABI:

$ pkg query -e '%q == freebsd:9:x86:64' %n

Salt - Compound Matching

While we're still working on an external pillar system, I've had to get creative on some Windows software deployments.

Windows desktops can receive our ERP software, but servers cannot. We tend to stand up desktop OS versions as "servers", so it's not as simple as matching on the osversion grain.

So, I’ll use a compound matching statement to apply a state to all Windows systems, except those we need to exclude:

salt -C 'G@os:Windows and not E@(SERVICE|ROES|SMUG)' state.sls erp.net

Or, find all Windows systems running an older version of Salt (AMD64):

salt -C 'G@os:Windows and G@saltversion:0.* and G@cpuarch:AMD64' grains.item saltversion
  saltversion: 0.17.4
  saltversion: 0.17.2
  saltversion: 0.17.2

Using CloudFront

We have a product catalog hosted on our public web server. That catalog and all of its contents are pretty large, and it's accessed by every single user of our ROES software.

Having it in a single location has always been a sore spot: it's at a colo in California, which means east coast users see much longer transit times than our west coast customers.

So, I started to look at using CloudFront as a CDN.

The first tool I used was s3cmd, which I integrated into a Jenkins job. Whenever a new file is checked into the repo, s3cmd syncs it to a bucket:

s3cmd sync -r -P $WORKSPACE/ s3://templates-us01 --exclude "*.svn/*" --acl-public --cf-invalidate 

Deployment and cache invalidation have been working, but I still had to update our XML-based catalog.

First, the catalog was serialized onto a single line, which made it very difficult to work with. So I used xmllint, sed, and awk to convert the entire XML file's contents to use our new CDN:

xmllint --format TEST.xml  > pretty-test.xml
sed -e 's|http://www.bayphoto.com/Roestemplates/|http://roes-templates.bayphoto.com/|g' pretty-test.xml > cloud-front-test.xml
# avert your eyes, ugly piping here :)
awk '/nodeimage="http:\/\/roes-templates.bayphoto.com\//{ print $(NF-1)}' cloud-front-test.xml| grep http | sort -u | awk -F\" '{print $2}' > nodeimages.txt
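If I were doing it again, the extract-and-dedupe steps could collapse into a single pass. Here's a sketch against a tiny hand-made stand-in for the real catalog (sample.xml below is fabricated for illustration):

```shell
# sample.xml is a fabricated two-attribute stand-in for the real catalog
cat > sample.xml <<'EOF'
<node nodeimage="http://roes-templates.bayphoto.com/a.jpg" label="A"/>
<node nodeimage="http://roes-templates.bayphoto.com/b.jpg" label="B"/>
<node nodeimage="http://roes-templates.bayphoto.com/a.jpg" label="A dupe"/>
EOF

# grep -o pulls just the attribute, awk strips the quoting, sort -u dedupes
grep -o 'nodeimage="[^"]*"' sample.xml | awk -F'"' '{print $2}' | sort -u > nodeimages.txt
cat nodeimages.txt
```

Same result as the three-stage pipeline, one fewer temp file to keep track of.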

I then decided I needed to do a little QA and make sure everything was there:

for i in $(cat nodeimages.txt); do
  curl -s -o /dev/null -w "%{http_code} %{url_effective}\n" "$i"
done

This produced a list of items that are in the catalog, but not on the CDN.
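To pull just the failures out of that run, filter on the status column. A quick sketch, using a fabricated results.txt in place of the real loop output:

```shell
# results.txt is a fabricated stand-in for the curl loop's output
cat > results.txt <<'EOF'
200 http://roes-templates.bayphoto.com/a.jpg
404 http://roes-templates.bayphoto.com/b.jpg
403 http://roes-templates.bayphoto.com/c.jpg
EOF

# keep only the non-200 lines -- the objects missing from the CDN
awk '$1 != 200' results.txt > missing.txt
cat missing.txt
```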

As it turns out, those have been broken for a long time :)

SCP Files Newer Than $N Days

Sometimes we have to mix and match our tools; scp, rsync, and zfs send|recv seem to be my go-tos:

scp $(find . -type f -mtime -14d) storage:/data/DONE/
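One thing worth double-checking before handing that file list to scp: the sign on -mtime. A minus matches files modified *within* N days, a plus matches files *older* than N days (and the "d" suffix is a BSD find extension; a plain day count works on both GNU and BSD find). A quick sanity check with two throwaway files:

```shell
# make one backdated file and one fresh file, then see what -mtime matches
mkdir -p mtime-demo
touch -t 202001010000 mtime-demo/old.dat   # backdated well past 14 days
touch mtime-demo/new.dat                   # modified just now

find mtime-demo -type f -mtime -14   # prints only mtime-demo/new.dat
find mtime-demo -type f -mtime +14   # prints only mtime-demo/old.dat
```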

Updating ZFS ACLs

Once you set an ACL to inherit from the parent, it's all fine and good until you decide that you need to update ALL existing files with a new ACL.

We have a group of users that need the read_set permission, but it needs to be applied to the parent folder as well as all child objects (files and directories).

Since files and directories have slightly different ACLs, I used quick find and xargs commands to apply it to both:

Setting on the top-level shares:

setfacl -m group:ro_users:read_set:fd:allow /path/to/share

This only affects newly created data, so I still have to run:

find . -type f -print0 | xargs -0 -n 3 setfacl -m group:ro_users:read_set:allow
find . -type d -print0 | xargs -0 -n 3 setfacl -m group:ro_users:read_set:fd:allow
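Before letting a sweep like that loose on thousands of files, it's worth a dry run: put an echo in front of setfacl and it prints each invocation instead of executing it. A sketch on a scratch tree (the group and file names here are just the example names from above):

```shell
# build a throwaway tree to preview against
mkdir -p acl-demo/sub
touch acl-demo/f1 acl-demo/sub/f2

# "echo setfacl" prints the commands that would run, touching nothing
find acl-demo -type f -print0 | xargs -0 -n 3 echo setfacl -m group:ro_users:read_set:allow
find acl-demo -type d -print0 | xargs -0 -n 3 echo setfacl -m group:ro_users:read_set:fd:allow
```

Once the printed commands look right, drop the echo and run it for real.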

Generating SSL CSR

I'm sure this is one of those little things that many sysadmins google on a regular basis.

I use a simple script, newcsr.sh:

openssl req -out "$1.csr" -new -newkey rsa:4096 -nodes -keyout "$1.key"

$ ./newcsr.sh hostname.domain.com


Use [tmux](http://tmux.sourceforge.net/)

That's it.

Seriously, stop what you are doing and install it. Fire it up, and force yourself to use it non-stop for the next week.

I cannot stress enough how useful tmux is to me and the team I work with.

It should be obvious that long-running commands or jobs should never live in a terminal session that can be killed off by a network connection reset or an accidentally closed laptop lid.