Code Debt-Free
Every now and then, the opportunity arises to write debt-free code (meaning free of technical debt). When such opportunities come, we must seize them.
I recently had the distinct pleasure of cranking out some Perl modules in the following order:
1. Write documentation for the forthcoming functionality.
2. Implement unit tests for that forthcoming functionality.
3. Verify that the unit tests fail.
4. Implement the awaited functionality.
5. Verify (jumping back to step 4 as necessary) that the unit tests pass. (A small Perl sketch of this cycle follows below.)
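Purely for illustration (the My::Math module and its doubler() function are invented here and are not from the actual project), steps 1 through 5 might look something like this:

# lib/My/Math.pm -- step 1: the POD documents the behavior before any code exists
package My::Math;
use strict;
use warnings;

=head1 NAME

My::Math - toy module illustrating the documentation-first cycle

=head2 doubler($n)

Returns twice the numeric value of $n.

=cut

# Step 4: the implementation, written only after the POD and the tests below
sub doubler {
    my ($n) = @_;
    return $n * 2;
}

1;

# t/doubler.t -- steps 2 and 3: write the tests, run them, watch them fail
use strict;
use warnings;
use Test::More tests => 3;

use_ok('My::Math');
is( My::Math::doubler(2),   4, 'doubles a positive integer' );
is( My::Math::doubler(-3), -6, 'doubles a negative integer' );

Running perl -Ilib t/doubler.t while doubler() is still unwritten gives the failing run of step 3; running it again once the sub is filled in closes the loop at step 5.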
Timelines, interruptions, and other pressures often get in the way of this short-term development cycle. The cycle can feel tedious; it makes the task of implementing even simple functions seem unpleasantly large and drawn out. When an implementation approach flashes into the engineer’s mind, leaping to step 4 (implementation) feels natural and immediately gratifying. The best-intentioned of us can fall into this out of habit, out of inertia, out of raw enthusiasm.
Documentation, though, demonstrates that you know what you’re trying to achieve. It is not a nicety; it is proof that you understand the problem at hand. Unit tests, as hard as they can sometimes be to …
perl tips
MySQL vs. PostgreSQL mailing list activity
My co-worker, Greg Sabino Mullane, noted this writeup on the MarkMail blog comparing the amount of traffic on the various MySQL and PostgreSQL mailing lists.
I suppose you could pessimistically say that PostgreSQL users need more community help than MySQL users do, but reviewing the content of the traffic (and going from years of personal experience) doesn’t support such a view. The PostgreSQL community seems to have more long-term, deeply involved users who are also contributors.
But let’s hope the competition in the free database world picks up. It looks like the new Drizzle project has a good chance of growing a new community around MySQL.
In any case, the MarkMail mailing list archive and search service is an excellent resource. Thanks, MarkMail folks!
database postgres
Git push: know your refspecs
The ability to push and pull commits to/from remote repositories is obviously one of the great aspects of Git. However, if you’re not careful with how you use git-push, you may find yourself in an embarrassing situation.
When you have multiple remote tracking branches within a Git repository, any bare git push invocation will attempt to push all of those remote branches out. If you have commits stacked up that you weren’t quite ready to push out, this can be somewhat unfortunate.
There are a variety of ways to accommodate this:
- use local branches for your commits, only merging those commits into your remote tracking branches when you’re ready to push them out;
- push remote tracking branches out whenever you have something worth committing.
However, even with sensible branch management practices, it’s worthwhile to know exactly what it is you’re pushing. Therefore, if you want to have a sense of what a bare git push would do, always call it with the --dry-run option first. This will show you what the push would send out, where the conflicts are, and so on, all without actually performing the push.
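For example (here origin and master are just placeholder remote and branch names):

# Preview what a bare push to "origin" would send, without sending anything:
git push --dry-run origin

# Then push only the refspec you actually mean to publish,
# e.g. the local master branch to origin's master branch:
git push origin master:master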
It is ultimately best, though, to understand the …
git
Building Perl on 64-bit RHEL/Fedora/CentOS
When building Perl from source on 64-bit Red Hat Enterprise Linux, Fedora, CentOS, or derivatives, Perl’s Configure command needs to be told about the “multilib” setup Red Hat uses.
The multilib arrangement allows both 32-bit and 64-bit libraries to exist on the same system: the “non-native” 32-bit libraries live in /lib and /usr/lib, while the “native” 64-bit libraries go in /lib64 and /usr/lib64. That allows the same 32-bit RPMs to be used on either i386 or x86_64 systems. The downside is that 64-bit applications have to be told where to find, and where to install, their libraries, or they usually won’t work.
For Perl, to compile from a source tarball with the defaults:
./Configure -des -Dlibpth="/usr/local/lib64 /lib64 /usr/lib64"
Then build as normal:
make && make test && sudo make install
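As a quick sanity check after installation, you can ask the new perl to report the library path and architecture it was built with:

perl -V:libpth
perl -V:archname

The first should echo back the lib64 directories passed to Configure; the second should mention x86_64.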
I hope this information will come in handy for someone. I believe I learned it from Red Hat’s source RPM for Perl.
perl redhat
Perl incompatibility moving to 5.10
We’re preparing to upgrade from Perl 5.8.7 to 5.10.0 for a particular project, and ran into an interesting difference between the two versions.
Consider the following statement for some hashref $attrib:
use strict;
...
my ($a, $b, $c) = @{%{$attrib}}{qw(a b c)};
In 5.8.7, the @{…} construct will return a slice of the hash referenced by $attrib, meaning that $a gets $attrib->{a}, $b gets $attrib->{b}, and so on.
In 5.10.0, the same construct will result in an error complaining about using a string for a hashref.
I suspect it’s due to the hash dereference (%{$attrib}) being fully executed prior to applying the hash-slice operation (@{…}{qw(a b c)}), meaning that you’re not operating against a hashref anymore.
Fortunately, the fix is wonderfully simple and significantly more readable:
my ($a, $b, $c) = @$attrib{qw( a b c )};
The “fix”—which is arguably how it should have been constructed in the first place, but this is software we’re talking about—works in both versions of Perl.
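For reference, here is a tiny self-contained example (the keys and values are made up) that behaves the same under both 5.8.7 and 5.10.0:

use strict;
use warnings;

my $attrib = { a => 1, b => 2, c => 3 };

# Hash slice taken directly through the reference:
my ($x, $y, $z) = @$attrib{qw( a b c )};

print "$x $y $z\n";    # prints "1 2 3"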
perl
Signs of a too-old Git version
When running git clone, if you get an error like this:
Couldn’t get http://some.domain/somerepo.git/refs/remotes/git-svn for remotes/git-svn
The requested URL returned error: 404 error: Could not interpret remotes/git-svn as something to pull
You’re probably using a really old version of Git that can’t handle some things in the newer repository. The above example was from Git 1.4.4.4, the very old version included with Debian Etch. The best way to handle that is to use Debian Backports to upgrade to Git 1.5.5.
On Red Hat Enterprise Linux, Fedora, or CentOS, the Git maintainers’ RPMs usually work (though you may need to get a dependency, the perl-Error package from RPMforge).
If all else fails, grab the Git source and build it. I’ve never had a problem building the code anywhere, though building the docs requires a newer version of asciidoc than is easy to get on RHEL 3.
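A source build that skips the documentation (and therefore the asciidoc requirement) can be roughly as simple as the following; the version number and install prefix here are only examples:

tar xzf git-1.5.5.tar.gz
cd git-1.5.5
make prefix=/usr/local all
sudo make prefix=/usr/local install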
git
RailsConf 2008 Report
End Point’s Sean Schofield recently returned from a trip to RailsConf 2008. RailsConf is the annual gathering of Rails developers, held this year (for the second year in a row) in Portland, Oregon. There is also now a European version of the conference, held in the fall. The conference consisted of one day of tutorials, followed by three days of sessions and keynotes.
Attendance was extremely heavy this year (over 1,800 people) which caused some initial crowding issues with the sessions. Fortunately, these issues were resolved by the second day and the conference organizers even managed to schedule a repeat of the first day’s talks for those who were initially shut out.
Several notable Rails personalities were speaking at the conference this year, including David Heinemeier Hansson, the creator of Rails. Other speakers of note included Obie Fernandez (author of The Rails Way), Joel Spolsky and David Chelimsky (of RSpec fame).
Scott Chacon gave an interesting talk entitled Using Git to Manage and Deploy Rails Apps. The Git distributed source code management system has been taking the Rails world by storm this year. Two factors have contributed to this explosion in popularity. …
conference rails
PGCon 2008 Report
End Point’s Greg Sabino Mullane recently returned from PGCon 2008, the annual conference for the PostgreSQL database project. The conference was held in Ottawa, Canada, and is a mix of Postgres developers, companies who are using Postgres, students, and everyone else involved in the vibrant Postgres community.
Greg presented a talk on Bucardo, the multi-master and master-slave replication system for Postgres. He explained the strengths and weaknesses of Bucardo, its typical use cases, and described in detail how it works. He detailed the innovative use of “hooks”, which allow custom code to be fired at any point throughout the replication process. The hooks are also the method of doing conflict resolution and exception handling, two important factors for multi-master replication. He also discussed the use of DBIx::Safe to pass restricted database handles to the custom code, as well as future directions for the Bucardo project. The talk even ended on time and left time for questions. The slides are available on the PGCon 2008 site.
The other talk Greg gave was a “lightning talk” on DBIx::Cache, a query caching system for Postgres built on top of DBI and DBD::Pg. (Lightning talks are …
conference database open-source postgres bucardo perl