Breaking Bash
Recently I managed to break the bash shell in an interesting and puzzling way. The initial symptoms were very frustrating: a workflow process we use here (creating a development camp) failed for me, but for no one else. That was at least a clue that it was me, not the workflow process.
Eventually, I narrowed down the culprit to the “grep” command (and that was more through luck than steadfast Sherlock-like detective work).
$ grep foo bar
grep: foo: No such file or directory
Eh? grep is misparsing the arguments! How does that happen?
So I began to study my bash environment. Eventually I came up with this fascinating little typo:
export GREP_OPTIONS='—color=auto'
That's supposed to be:
export GREP_OPTIONS='--color=auto'
but it got recorded in my .bashrc as an em-dash, not a double-dash. (My guess is that I cut and pasted this from a web page where someone over-helpfully "typeset" the command.)
Ironically, this typo is innocuous under Bash 3.x, but when you slot it into a Bash 4.x installation, all heck busts loose.
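If you suspect a similar pasted "smart" character is lurking in your own dotfiles, one quick way to check is to look for any non-ASCII bytes (this assumes GNU grep built with PCRE support for the -P flag):

grep -nP '[^\x00-\x7F]' ~/.bashrc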
shell
PostgreSQL Point-in-time Recovery: An Unexpected Journey
With all the major changes and improvements to PostgreSQL’s native replication system through the last few major releases, it’s easy to forget that there can be benefits to having some of the tried-and-true functionalities from older PostgreSQL versions in place.
In particular, with the ease of setting up hot standby/streaming replication, it’s easy to get replication going with almost no effort. Replication is great for redundancy, scaling, and backups. However, it does not solve all potential data-loss problems. For best results it should be used in conjunction with Point-in-time Recovery (PITR) and the archiving features of PostgreSQL.
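As a sketch of that archiving setup (the archive path here is only an example; check the PostgreSQL documentation for your version), WAL archiving is enabled with a few postgresql.conf settings:

wal_level = archive        # or hot_standby if the server also feeds streaming replicas
archive_mode = on          # changing this requires a server restart
archive_command = 'test ! -f /mnt/wal_archive/%f && cp %p /mnt/wal_archive/%f'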
Background
We recently had a client experience a classic database blunder: performing a manual UPDATE without wrapping it in a transaction block and validating the changes before committing. The table in question was the main table in the application, and the client had run an unqualified UPDATE, unintentionally setting a specific field to a constant value on every row instead of targeting the single row they were going for.
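The safer pattern is cheap insurance; here is a sketch, with invented table and column names:

psql -d appdb <<'SQL'
BEGIN;
UPDATE customers SET status = 'active' WHERE id = 42;   -- note the WHERE clause
SELECT id, status FROM customers ORDER BY id LIMIT 5;   -- eyeball the result first
ROLLBACK;  -- rerun with COMMIT once the SELECT looks right
SQL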
Fortunately, the client had backups. Unfortunately the backups themselves …
database postgres replication disaster-recovery
Design for the Quotidian. Build for the 100-Year Flood.

After years of building software, I’ve crossed over to the design side and spent the last several months thinking about UI/UX. In doing so I’ve come to realize that, as disciplines, engineering and design are far apart: software development challenges you to think of the worst and least likely, whereas design asks you to consider the everyday essence of the thing.
Floods are uncommon; if they weren’t, whoever built your home would have knowingly built it in a river. Floods are what software developers call edge cases: uncommon happenings, and the subject of our worry. They are what we build for. What happens when a user accidentally dumps Moby Dick into the field labeled “First Name”? Or, what happens when the hackers come calling; what weakness will they find?
As users of software, people mostly behave as expected, but not always. Like a city planner preparing for a 100-year flood, software developers carefully consider unlikely events, especially when building for large organizations, the Internet, or a long lifespan. In the fullness of time, every possible event occurs. The unusual will happen, which is to say that software not engineered for edge cases is eventually hacked, …
design development
MediaWiki major upgrade process
Keeping your MediaWiki site up to date with the latest version is, like many sysadmin tasks, a never-ending chore. In a previous article I covered how to upgrade minor revisions of MediaWiki with patches. In this one, I’ll cover my solution to doing a “major” upgrade to MediaWiki. While the official upgrade instructions are good, they don’t cover everything.
MediaWiki, like Postgres, uses a three-section version number in which the first two numbers combined give the major version, and the number on the end the revision of that branch. Thus, version 1.26.2 is the third revision (0, then 1, then 2) of the 1.26 version of MediaWiki. Moving from one major version to another (for example 1.25 to 1.26) is a larger undertaking than updating the revision, as it involves significant software changes, whereas a minor update (in which only the revision changes) simply provides bug fixes.
The first step to a major MediaWiki upgrade is to try it on a cloned, test version of your wiki. See this article on how to make such a clone. Then run through the steps below to find any problems that may crop up. When done, run through again, but this time on the actual live site. …
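As a rough outline only (the path and version number are examples, not this article's full procedure), a major upgrade of a Git-based install plus the schema update step looks something like:

cd /var/www/wiki-clone              # work on the cloned test wiki first
git fetch --tags origin
git checkout 1.26.2                 # example target release tag
php maintenance/update.php --quick  # apply any database schema changes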
mediawiki
Using Google Analytics to understand and grow your business
Google Analytics, a web analytics service offered by Google, is a very handy tool for understanding your audience. It shows you where your traffic comes from and what resonates with your visitors, which has made it the most widely used web analytics service on the internet. Once you understand your website’s traffic, you can focus your site and content to optimize engagement and growth.
Google Analytics breaks out your traffic by channel, which makes it much easier to see what’s working and what’s not:
- Organic — traffic from search engines which is not paid for
- Paid Search — visitors that clicked on one of your paid advertisements (also known as Pay-Per-Click or PPC)
- Direct — visitors that typed your website address directly into the browser (includes bookmarks)
- Social — traffic from sites that are considered to be social networking sites
- Referral — visitors that arrived via links on third-party sites
- Email — visitors that are directed from an email
- Display — visitors directed from video and display advertising
It will be helpful to walk through an example. Say you launch an …
analytics
Install WordPress on Heroku in OS X Yosemite
I wanted to install WordPress locally for my blog (about programming!), but using MAMP, XAMPP, or even Vagrant for this seemed like overkill. I wanted a light setup. PHP and Apache are already integrated into Mac OS X, so why not use them? I also wanted to deploy the app to Heroku, which added a wrinkle: Heroku only provides PostgreSQL, not MySQL, out of the box. I’d like to share my research on how I did it.
WordPress with Heroku support
I found this handy WordPress template with built-in Heroku support. It has everything one needs to run WordPress on Heroku: PostgreSQL for WordPress (because MySQL on Heroku is a paid service), Amazon S3 and CloudFront for your uploads (since Heroku has an ephemeral file system), WP SendGrid to send emails, and WordPress HTTPS. Check out a copy with this command:
git clone https://github.com/mhoofman/wordpress-heroku.git
Let’s run the project locally first, because files cannot be written to Heroku’s file system; updating and installing plugins or themes should be done locally anyway and then pushed to Heroku. I’m using PhpStorm for my PHP development.
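For the eventual deploy, the basic Heroku steps look something like this (the app name here is invented, and the add-on plan name may vary with your Heroku CLI version):

cd wordpress-heroku
heroku create my-wp-blog
heroku addons:create heroku-postgresql:hobby-dev
git push heroku master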
Configuring Apache
mkdir -p ~/Sites
echo "<html><body><h1>my site …apache heroku php wordpress
Sort product attribute options by the position property in Magento
Introduction
Recently I was working with Magento 1.9.1, trying to get a list of dropdown attribute options sorted by the position property. However, there is a known bug in Magento 1.9.1 where the position property is not respected.
I looked for a patch to fix this issue, but there was no official patch, and none of the available community fixes were good enough. So once again, I needed to fix it myself.
Tip! If you know how to apply a patch file, it is here. If not, please continue.
Part 1
Unfortunately, we need to override some Magento core code. The good thing is that Magento provides a clean way to do this: instead of editing the core files directly, we create a local copy that takes precedence.
Copy the app/code/core/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php file to app/code/local/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/Collection.php. You need to create the whole directory structure first. On a Unix system, running from the Magento root, it is as simple as:
mkdir -p app/code/local/Mage/Catalog/Model/Resource/Product/Type/Configurable/Attribute/
cp …
ecommerce magento
File names the same except for capitalization
Most Unix filesystems, including all the common Linux ones, are fully case-sensitive, meaning you can have two files in the same directory that differ only by case:
- a-very-nice-image.png
- a-VERY-nice-image.png
However, this is not true on Windows and Mac OS X. They will preserve your chosen capitalization, but each file name must be unique regardless of the case.
I don’t know of situations where it would be wise to have such conflicting mixed-case files even on Linux where it works fine. But for various reasons this can happen in the messy real world. If you then send those files to someone on Windows or Mac OS X in a zip file, or via Git version control, they’re going to be confused.
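Before zipping up or pushing such a tree, you can spot case-insensitive name collisions ahead of time; something like this works with GNU coreutils (sort -f folds case so that uniq can then compare case-insensitively):

find . -type f | sort -f | uniq -Di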
When unzipping, usually the last file to be extracted will overwrite the earlier one with the nearly-same name. So a file that is perhaps important will just be mysteriously gone.
When pulling in files with Git, the same thing happens, but you also immediately have an unclean working copy that Git will tell you about:
$ git status
On branch master
Your branch is up-to-date with 'origin/master'.
Changes not staged for commit:
(use "git add <file>..." to update what will be …sysadmin tips