Fallout 4 Comparison: Bunker Hill In Real Life v. In-Game

A few months ago, I took a new job that had my family and me relocating from Baltimore to Boston. Getting settled into the new gig and getting settled into the new home routine have both contributed to my lack of time to write here.

We arrived just in time for the notorious Boston winter. We also arrived just in time for the release of Fallout 4, which takes place in a fictionalized version of Boston some 250 years in the future. The apartment we're renting is only a few hundred feet from the historic Bunker Hill Monument, which, in-game, has become a refuge for traders retired from traveling the wasteland of the post-apocalyptic Commonwealth.

This morning, the sun was out and I had a little time to take some pictures of Bunker Hill for a comparison with its in-game representation.

Micro 4/3 and the Olympus OM-D EM-1

I’ve been a photographer since high school. I’ve usually been a hobbyist, occasionally making some money, rarely making the bulk of my income from photography. I started with a Minolta STSi, eventually working up to a Nikon F5 before I made my move to digital with a D200. When that started to gather dust, I moved on to a few point-and-shoots, experimenting with not having an SLR; leaning on my phone for day-to-day shots, and a small camera when I thought I’d use it. In practice, that marked the period of my life where I took the absolute fewest pictures. That was a bummer.

I’ve been curious about the new (relatively speaking) compact mirrorless camera systems. I researched the two main systems, APS-C versus Micro Four Thirds (MFT). The more I read, the clearer it became that MFT is winning the battle despite the documented advantages of APS-C’s larger sensor size[1]. In my research, there were more lens options available for MFT than APS-C, which I took as an indicator of the platform’s momentum. Hell, the Wirecutter has a writeup of the MFT lenses you should investigate for your new camera, but no such article exists for APS-C. I ordered an Olympus OM-D EM-1. Plus, the camera is gorgeous in silver, which certainly didn’t hurt.

I ordered two lenses with the camera: an Olympus M.Zuiko Digital 17mm f/2.8 and an Olympus M.Zuiko Digital 45mm f/1.8[2] - 34mm and 90mm 35mm-equivalent focal lengths, respectively.

On an overcast day, the 45mm didn't have any trouble snapping to focus, and freezing Tex and his favorite ball. 

Olympus’ autofocus is fast and accurate. Its face detection is helpful, in that it tries to intelligently focus on faces in the frame as best as it can, and it usually does a good job choosing the faces that are important based on the focal plane. I’ve been really impressed with the speed of the system, but I wonder how much of that is just the progression of technology overall since I bought my D200.

Despite its slower maximum aperture, I’ve found that the 17mm is almost always the one I leave on the body. Maybe once the boy starts running more than fifteen feet away, I’ll use the 45mm more frequently. The 17mm is a good all-around lens. It’s not super fast or super sharp, but it also sits at a price point where I don’t worry about it always being on my camera. On the other hand, there have been a couple times I wished I had sprung for the $150 more expensive f/1.8 version of the lens. I might, yet.

Either way, once again having a camera I’m confident in and with means I’ve taken a ton of pictures. Mostly of my son, but that gives me something else to work on: expanding the subjects and events I’m comfortable shooting.

  1. See also. See also.  ↩

  2. I’ve linked to the international SKU, which is $150 less expensive, but carries no warranty. The Olympus-warrantied SKU is here, if you prefer. I couldn’t find an international SKU for the 17mm. I should note that I actually got a pretty good deal on the warrantied SKU, so I bought that one.  ↩

Updating rsync on OS X

I recently needed to move a few directories of tens or hundreds of thousands of files to my Synology. A perfect use for rsync!

Unfortunately, rsync on OS X is stuck at 2.6.9, and I wanted to take advantage of some of the new features of 3.1.0; specifically, better handling of OS X metadata and progress indication.

Fortunately, this walks you through a quick build and installation of rsync 3.1.0 in your /usr/local/bin folder. The benefit is that your new version lives alongside the OS X-included version (which is installed in /usr/bin/). Then you can add a couple of aliases to your .bash_profile to invoke each one appropriately.
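The gist of the linked walkthrough is a standard from-source build (a sketch only; the post also applies patches for better OS X metadata handling, which are omitted here, and this assumes you've downloaded and unpacked the 3.1.0 source tarball):

```shell
# Build rsync 3.1.0 from source and install it under /usr/local,
# leaving the stock /usr/bin/rsync untouched.
cd rsync-3.1.0
./configure --prefix=/usr/local
make
sudo make install   # the new binary lands at /usr/local/bin/rsync
```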

My aliases are as follows:

alias oldrsync="/usr/bin/rsync"
alias rsync="/usr/local/bin/rsync"
alias nrsync="/usr/local/bin/rsync -a --info=progress2"

The first makes the system-installed 2.6.9 version referenceable by using the command oldrsync. The second makes the version in /usr/local/bin/ (which is 3.1.0, in my case) the one that runs when I type rsync. Finally, the third references the new version of rsync with a couple flags I almost always use.
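With the aliases loaded, usage looks something like this (a sketch; the hostname and paths are placeholders):

```shell
# Mirror a local folder to the NAS: archive mode, plus a single overall
# progress meter instead of per-file output.
nrsync ~/Pictures/ admin@synology.local:/volume1/photo/

# Fall back to the stock 2.6.9 binary if something misbehaves:
oldrsync --version
```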

Source: http://selfsuperinit.com/2014/01/04/an-upd...

Grantland: The Tale of Two Flaccos

To recap: durability, pretty deep balls AND timely pass-interference penalties! You wouldn’t call it the sexiest quarterback package, and you certainly wouldn’t feel great about paying one of the league’s most lucrative ransoms for it. […]

Then again, the Ravens weren’t paying for Joe Flaccid. They were paying for Joe Flacco. You know, the calm dude from the playoffs. The towering, smiling, handsome, lanky, confident, gunslinging, teaser-killing, flag-generating, deep-ball-flinging machine. This version of Flacco suffered growing pains: seven straight road playoff battles; in the first five, he had no 200-yard games, one touchdown and six picks, three wins and two losses, and two season-ending stink bombs against the 2008-09 Steelers (three picks) and 2009-10 Colts (two picks, three points total). He looked better in his third postseason appearance (two more road games, including a blowout win in K.C.), then blossomed the following winter when deep threat Torrey Smith showed up. Flacco’s seven-game playoff stretch from January 2012 through last weekend kinda sorta maybe backs up John Harbaugh’s claim that Flacco is “the best quarterback in football.”

Bash on Synology DiskStation

I picked up a Synology DS415play a couple weeks ago. I’ve been looking for a good way to store family memories, as well as to have device-independent, on-site storage and redundancy.

The Synology DiskStation is a Linux-based NAS, and I couldn’t stand that the default shell was ash. How to fix?

I’m taking a lot from this post on the Synology forums, and trying to explain it in a little more detail[1].

Finding the bootstrap file for the Intel Atom CE5335 was a bit of a challenge, since Synology doesn’t use it as widely as some other CPUs in their lineup. Fortunately, the thread I linked above has a relatively recent (Nov 2014) bootstrap for a DS214play, which uses the same CPU. I guessed it would be the same, and it was.

I’m going to assume that if you’re reading this, you are thinking of doing the same on your DiskStation, and that you have an ill-defined-but-higher-than-zero knowledge of both *nix systems and how to Google.

First, you’ll need to ssh into the NAS as root (root’s password is the same as the admin user password).

You’ll need to execute the following commands, command by command:

$ cd /volume1/@tmp  
$ wget http://ipkg.nslu2-linux.org/feeds/optware/syno-i686/cross/unstable/syno-i686-bootstrap_1.2-7_i686.xsh  
$ chmod +x syno-i686-bootstrap_1.2-7_i686.xsh  
$ sh syno-i686-bootstrap_1.2-7_i686.xsh  
$ rm syno-i686-bootstrap_1.2-7_i686.xsh

Line by line, the above does the following:

  1. Change into the Synology’s temp directory
  2. Download the bootstrap script
  3. Make the script executable
  4. Run the script
  5. Remove the script

At this point you have ipkg, the package manager, installed, but your shell doesn’t know about the folder it’s installed in. You’ll need to add /opt/bin/ and /opt/sbin/ to the PATH[2] in your .profile.
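The .profile change is a couple of lines (a sketch; putting /opt first means the ipkg-installed tools win over the stock versions):

```shell
# Prepend ipkg's directories to the search path:
PATH=/opt/bin:/opt/sbin:$PATH
export PATH
```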

While you’re in your .profile, you might see a line that says HOME=/root/. I changed mine to HOME=~/, since I want this .profile to be portable between users. I copied it over to the admin user when I finished, so I have the same experience whether I connect as admin or root.

Now, if you type ipk and hit [tab] it should autocomplete to ipkg.

So, let’s install bash:

$ ipkg install bash

Bam. You’ll get some output, then you should have bash installed.

Now I needed a way to land in bash whenever I connect to the device. Again, the Synology forums came to the rescue.

Add the following to the end of your .profile:

if [[ -x /opt/bin/bash ]]; then
    exec /opt/bin/bash
fi

The above checks whether /opt/bin/bash is executable. If it is, exec replaces the current shell with bash; if it’s not, nothing happens, leaving you in ash.

Add the following to .bashrc:

PS1='\u@\h:\W \$ '  
export SHELL=/opt/bin/bash

The top line you may want to adjust to your taste. That’s how I like my prompt to look. Here’s a good tool to help you build the prompt that suits you.

The second line sets the SHELL variable to /opt/bin/bash. Remember that .bashrc is only read by bash when bash is started, so the SHELL only gets set if bash is called.

Now before you close your current SSH session, start a second. You should get your new, fancy bash prompt. Success!

Once you have that good feeling, copy .profile and .bashrc to /volume1/homes/admin/, and start another ssh session, this time connecting as admin. If that works, you’re set.

  1. I find that if I start a project like this by thinking about (and sometimes outlining) a post like this as I go, I have a better understanding of what I’m doing. Often, if I can’t follow the thread from beginning to the end, I don’t actually begin the project because I feel like I don’t understand the process well enough.  ↩

  2. Here’s an explanation of the PATH variable in UNIX, if you’re not familiar.  ↩

The Origins of the HTML blink Tag

Lou Montulli in an old (but it seems to be impossible to find out how old) post on what appears to be a kind of personal website:

At some point in the evening I mentioned that it was sad that Lynx was not going to be able to display many of the HTML extensions that we were proposing, I also pointed out that the only text style that Lynx could exploit given its environment was blinking text. We had a pretty good laugh at the thought of blinking text, and talked about blinking this and that and how absurd the whole thing would be. The evening progressed pretty normally from there, with a fair amount more drinking and me meeting the girl who would later become my first wife.

  1. Shocker.
  2. It seems pretty sad that someone so involved with developing the web as we know it has this for a website.

Song Exploder - Episode 24: Tycho

The only bad thing about Song Exploder is that I've spent a lot more money on music I otherwise might not have found. Woe is me.

Be sure not to miss the year-end episode Sea of Love by the National. You can go through the year-long archives and find plenty of gold. The access that Hrishikesh Hirway has been able to gain over 52 weeks is a great example of how thoughtful, interesting, and insightful commentary opens doors.

Modern Data Storage

Since earlier this year, I’ve been making a focused effort to ensure that my and my family’s important data is safe. It’s the closest I ever get to making a New Year’s resolution[1]. When I picked up my 2008 MacPro a couple years ago, I built my own Fusion Drive, but also threw in a second, larger spinning-disk drive for internal Time Machine backups.

Obviously, a single copy is not a backup, and for years I have kept all of my documents in Dropbox. That’s a second copy of most stuff. I signed up for BackBlaze earlier this year as well. That captures everything Dropbox does, plus the few things on my MacPro’s hard drive[2], that aren’t in Dropbox.

On-Site Improvements

I recently added a Synology NAS to the mix. On-site, large storage (with multi-drive redundancy), including multiple user accounts, various web services, and slew of other features made it very appealing. I picked the DS415play, because of the hardware video transcoder, and hot-swappable drives[3].

The Synology also allows each user account to sync their Dropbox (among other cloud storage providers) to a folder in their home directory, which seemed like a nice way to have a second on-site copy of all of my and Lindsay’s docs.

In addition, the PhotoStation feature will help me solve the issue I’ve been struggling with: how do I make sure that Lindsay and I both have consistent, effortless access to our family photos, without relying on an intermediary cloud service? Neither of us is interested in uploading all of our photos to Facebook or Flickr just to share them. It also takes thought, effort, and coordination on our parts to get photos out of our Photo Streams and give them to one another. I want to minimize that, while ensuring that these photos are well backed up.

Unfortunately, there’s no package to back up my Synology to BackBlaze. Marco had an article that highlighted his issues trying to make that work; he wound up settling on CrashPlan. I’ll likely do the same.

Expect posts in the coming weeks about how I’m messing with this stuff. I’ve found it to be a lot of fun, already, and I’m pretty impressed with the Synology. It’s a little fiddly for most people, but if you’re inclined to be a nerd - especially a Unix-y nerd - it’ll be right up your alley.

  1. I’m a big fan of the idea that if something is important to you, you should be doing it already.  ↩

  2. You might have noticed that I’m pretty focused on backing up my MacPro, and I’m much less worried about my rMBP. There are two reasons for this: first, my MacPro has all of my family photos in Aperture libraries that are too big to go into Dropbox; second, everything on the rMBP is in Dropbox, thanks to Dropbox for Business’ ability to sign the app into both a work account and a personal account. I symlinked ~/Documents/ to ~/Dropbox (Under Armour)/Documents/, but I still have access to my personal Dropbox at ~/Dropbox (Personal)/. The only things that aren’t in there are my Downloads folder (which could be easily, and arguably should be), and my ~/Sites/ folder, which I really only use for Cheaters and as a repository for various software and configs for routers, switches, and WAN optimization devices, and can be discarded at will.  ↩

  3. Synology’s feature matrix is a bit of a mess, but eventually I decided that hot-swappable drives was a must, which took me up to the DS415, and adding the transcoder was an additional $60, so that made the cut, but each person’s needs are going to be a little different.  ↩

Thank You, Tim Cook

Casey Newton on the Verge:

So "move on," if Cook’s essay today makes you so uncomfortable. Return to talking about his fastidiousness, or his supply-chain management, or whatever. But there’s no moving on for me, not today. This I’m going to savor.

So should we all.

Stuff You Missed in History Class - Beast of Gevaudan

This is the perfect pre-Halloween podcast. Grisly deaths; heroic figures; people who attempt to rise to "hero" status but make themselves look like fools; supposedly supernatural monsters; all in late-18th-century France.

And, the best part? After you listen, you can watch Brotherhood of the Wolf with a fire and a glass of wine. That's exactly how I'd like to lead into Halloween.

Unblock-Us and Netflix Update

Nick wrote in with a good note regarding my Unblock-Us + BIND setup:

I noticed after setting up the netflix.com zone that unblock.us resolved most of the Netflix addresses to a CNAME, e.g.:

 secure.netflix.com.   86400   IN  CNAME   secure-1848156627.us-west-9.elb.amazonaws.com.

My ISP’s DNS server did not know about the address secure-1848156627.us-west-9.elb.amazonaws.com, but the unblock.us DNS server resolved it successfully. So I just added another zone for amazonaws.com, and forwarded those requests to unblock.us. That seems to have resolved it - Netflix now works. Not ideal, since the rule is a bit general, but I’m happy to have it working.

Good investigation, and little things like this may resolve some of the issues I was seeing with this setup last year. I don't have the patience to keep up with it, but I'm certain some of you are more patient than I am.
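For reference, forwarding a zone in BIND looks roughly like this (a config sketch; the forwarder addresses below are placeholders for the actual Unblock-Us resolvers):

```
// Hypothetical named.conf fragment: send amazonaws.com lookups to the
// Unblock-Us DNS servers instead of the ISP's resolver.
zone "amazonaws.com" {
    type forward;
    forward only;
    forwarders { 203.0.113.1; 203.0.113.2; };
};
```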

The Horror of a 'Secure Golden Key'

An extraordinarily clear and understandable post by Chris Coyne that explains exactly what's wrong with the idea that by protecting our data, Apple (and Google, and other service providers) are only serving to protect the guilty. In fact, they're protecting us all, and in many ways.

Beyond all the technical considerations, there is a sea change in what we are digitizing.

We whisper “I love you” through the cloud. We have pictures of our kids in the bath tub. Our teens are sexting. We fight with our friends. We talk shit about the government. We embarrass ourselves. We watch our babies on cloud cameras. We take pictures of our funny moles. We ask Google things we might not even ask our doctor.

Even our passing thoughts and fears are going onto our devices.

Time was, all these things we said in passing were ephemeral. We could conveniently pretend to forget. Or actually forget. Thanks to the way our lives have changed, we no longer have that option.

This phenomenon is accelerating. In 10 years, our glasses may see what we see, hear what we hear. Our watches and implants and security systems of tomorrow may know when we have fevers, when we're stressed out, when our hearts are pounding, when we have sex and - wow - who's in the room with us, and who's on top and what direction they're facing*. Google and Apple and their successors will host all this data.

We're not talking about documents anymore: we're talking about everything.

You should be allowed to forget some of it. And to protect it from all the dangers mentioned above.

As I increasingly use my various devices as an outboard brain (which I do, a lot), I need things to be ephemeral. I need to be able to tell my outboard brain to forget stuff with only slightly more difficulty than my real brain forgets stuff. And I want to know that, e.g., the NSA isn't creeping on stuff I've already forgotten.

Cheaters to launch an SSH Session

I'm a stalwart Terminal fan for my engineering tasks. I don't understand why so many colleagues prefer a third-party client like SecureCRT when we have native SSH built right into the OS. One thing a lot of SecureCRT guys hold over my head is nested folders of saved SSH sessions.

It dawned on me this morning that I could duplicate that functionality in something I'm already using: Brett Terpstra's Cheaters.

I won't get into an in-depth review of Cheaters, here. Simply put, it's a small app that launches a web view of a locally-hosted set of websites. Brett's suggestion is to use it as a place to keep cheatsheets (hence the name), like a virtual cubicle wall.

I used a little grep and sed on our existing hosts file and came up with a Markdown list of links to the hostnames of our devices.
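The transformation looked something like this (a sketch, not the exact command; awk stands in for the grep-and-sed pass, and the hostnames are samples). Each output line is a Markdown link wrapping a hostname in an ssh:// URL, which OS X hands off to Terminal:

```shell
# Sample stand-in for the real hosts file:
cat > /tmp/hosts.sample <<'EOF'
# network gear
10.0.0.1   core-router1
10.0.0.2   edge-switch1
EOF

# Skip comments, take the hostname field, and wrap it in a Markdown link:
awk '!/^#/ && NF >= 2 {printf "- [%s](ssh://%s)\n", $2, $2}' /tmp/hosts.sample
# → - [core-router1](ssh://core-router1)
#   - [edge-switch1](ssh://edge-switch1)
```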


I spent a couple minutes sorting the list into a reasonable hierarchy, then I used this nice little tutorial to create expanding lists using CSS and jQuery. I ran my Markdown list through Brett's own Marked 2, and copied the HTML to a new cheatsheet.

I took the CSS and JavaScript from the tutorial linked above, and dropped them into the appropriate folders inside of my Cheaters folder. In my new cheatsheet, I linked the specific CSS and JavaScript:

<head>
    <link rel="stylesheet" href="css/expandolist.css">
    <script type="text/javascript" src="js/expandolist.js"></script>
</head>

I didn't need to worry about jQuery, since Cheaters already uses it. I added the appropriate IDs to the div that holds the list and to the first ul element. That's really all there was to it. Now I have a nice, organized, expandable list that lives in my menubar, which I can use to launch SSH sessions right in Terminal without having to remember specific hostnames. Not bad for 45 minutes of effort.

My Cheaters SSH list