  • Rails and SEO advantages


    By Ryan Masters
    March 27, 2009

    In today’s climate, search engine optimization is a must for staying competitive. Rails routing provides this advantage and much more.

    Descriptive, content-packed URLs afford your website better search rankings because they provide clear context as to what the page is about. Using keywords in the filename goes even further. Under normal circumstances, without advanced configuration, a web page filename is rigid and fixed. This isn’t a problem in itself, except that it doesn’t help with SEO one bit.

    Having multiple URLs linking to the same page opens more doors to search engine crawlers. Generally, once indexed correctly, this means more access paths into your site, which in turn results in a greater variety and volume of traffic.

    In most other programming languages, you would need to use an Apache rewrite rule to accomplish this. Such a rule detects a digit in a file name and passes it along as a parameter to another dynamically generated page.

    RewriteRule ^/.*([0-9]+).*$ /index.php?i=$1 [R=301,L]

    This rule is almost certainly too greedy a match; however, it serves to illustrate the point. With that rule in place, any request containing at least one number will be …
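    To see just how greedy that pattern is, sed’s extended regexes behave the same way as Apache’s here, so a one-liner (the sample path is made up) shows what actually gets captured:

    ```shell
    # The leading .* is greedy, so it swallows all but the last digit
    # before the capture group gets a chance to match.
    echo "/products/1234/reviews" | sed -E 's|^/.*([0-9]+).*$|i=\1|'
    # prints "i=4" -- only the final digit survives as the parameter
    ```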


    rails seo

    Inside PostgreSQL — Multi-Batch Hash Join Improvements


    By Josh Tolley
    March 26, 2009

    A few days ago a patch was committed to improve PostgreSQL’s performance when hash joining tables too large to fit into memory. I found this particularly interesting, as I was a minor participant in the patch review.

    A hash join is a way of joining two tables where the database partitions each table, starting with the smaller one, using a hash algorithm on the values in the join columns. It then goes through each partition in turn, joining the rows from the first table with those from the second that fell in the same partition.

    Things get more interesting when the set of partitions from the first table is too big to fit into memory. As the database partitions a table, if it runs out of memory it has to flush one or more partitions to disk. Then when it’s done partitioning everything, it reads each partition back from disk and joins the rows inside it. That’s where the “Multi-Batch” in the title of this post comes in: each partition is a batch. The database chooses the smaller of the two tables to partition first to help guard against having to flush to disk, but it still needs to use the disk for sufficiently large tables.
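    The partitioning idea can be sketched as a toy in the shell (using modulo as a stand-in for the real hash function, with made-up key values): rows whose join keys hash to the same bucket always land in the same partition, so each batch can be joined independently.

    ```shell
    # Assign made-up join keys to 4 partitions; matching keys from
    # both tables necessarily end up in the same bucket.
    for key in 10 11 12 13 14 15; do
      echo "key $key -> partition $((key % 4))"
    done
    ```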

    In practice, there’s one important optimization: after …


    postgres

    End Point: Search Engine Bot Parsing


    By Steph Skardal
    March 25, 2009

    I’ve talked to several coworkers before about bot parsing, but I’ve never gone into too much detail about why I support search engine bot parsing. When I say bot parsing, I mean applying regular expressions to access log files to record distinct visits by each bot. Data such as the URL visited, exact date and time, HTTP response, bot name, and IP address is collected. Here is a visual representation of bot visits (y-axis is hits).
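    The mechanics need nothing fancy; here is a sketch with standard Unix tools, using fabricated log lines in combined log format (the field position of the URL is an assumption about your log format):

    ```shell
    # Fabricated sample of a combined-format access log
    cat > access.log <<'EOF'
    66.249.66.1 - - [25/Mar/2009:10:00:00 -0500] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"
    66.249.66.1 - - [25/Mar/2009:10:01:00 -0500] "GET /about HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"
    66.249.66.1 - - [25/Mar/2009:10:02:00 -0500] "GET / HTTP/1.1" 200 1024 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"
    EOF
    # Count Googlebot hits per URL (the URL is field 7 in this format)
    grep -i googlebot access.log | awk '{print $7}' | sort | uniq -c | sort -nr
    ```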

    And here are the top ten reasons why search engine bot parsing should be included in search engine optimization efforts:

    #10: It gives you the ability to study search engine bot behavior. What is bot behavior after 500 error responses to a url? What IP addresses are the bots coming from? Do bots visit certain pages on certain days? Do bots really visit js and css pages?

    #9: It can be used as a teaching tool. I have already discussed several issues surfaced by this data and am happy to teach others about search engine behavior. After reading this post, you will be much more educated in bot crawling!

    #8: It gives you the ability to compare search engine bot behavior across different search engines. From some of the sites I’ve examined, the Yahoo bot …


    seo

    Generating sample text automatically


    By JT Justman
    March 25, 2009

    It’s a classic problem: you have a template to test, or a field constraint to test, and you need a large block of text. Designers and developers have come up with many ways to generate this sample data. My favorite is the classic ‘Lorem Ipsum’ Latin text used by typesetters for hundreds of years. For a long time I’ve just copy-and-pasted it, but I just happened to find a really cool Lorem Ipsum generator, complete with the ability to specify character length, paragraph number, word count, or even make a bulleted list. Simple and stylish, and easy for less technical collaborators to use. Hit the link for some fascinating history.

    I’m sure many of you have your own methods. Share yours in the comments! Extra points for creative shell one-liners.
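    To start things off, here is one admittedly uncreative shell sketch: a hypothetical lorem function (the name and seed phrase are made up) that repeats a short seed out to a requested word count:

    ```shell
    # Hypothetical helper: emit the first N words of a repeated seed phrase
    lorem() {
      words=$1
      text="lorem ipsum dolor sit amet consectetur adipiscing elit"
      # $text is deliberately unquoted so it splits into words
      printf '%s ' $text $text $text | tr ' ' '\n' | head -n "$words" | paste -sd' ' -
    }
    lorem 10
    ```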


    tips

    Ack, grep for Developers


    By Brian J. Miller
    March 19, 2009

    A relatively new tool in my kit that I’ve come to use very frequently over the last 6 months or so is Ack. Notwithstanding that it is written in my preferred development language, and is maintained by a developer active in the Perl community working on some important projects, like TAP, it really does just save typing while producing Real Purdy output. I won’t go so far as to say it completely replaces grep, at least not without a learning curve and especially for people doing more “adminesque” tasks, but as a plain old developer I find its default configuration and output settings incredibly efficient for my common tasks. I’d go into the benefits myself, but I think the “Top 10 reasons to use ack instead of grep” list from the ack site pretty much covers it. To highlight a couple here:

    1. It’s blazingly fast because it only searches the stuff you want searched.

    2. Searches recursively through directories by default, while ignoring .svn, CVS and other VCS directories. Which would you rather type?

      • $ grep pattern $(find . -type f | grep -v '\.svn')
      • $ ack pattern
    3. ack ignores most of the crap you don’t want to search

      • VCS directories
      • *blib*, the Perl build directory …
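    To see the find-based workaround from item 2 in action, a throwaway tree (names made up) with a .svn directory shows what gets skipped:

    ```shell
    mkdir -p demo/src demo/.svn
    echo "needle here" > demo/src/code.pl
    echo "needle here" > demo/.svn/entries
    # The find pipeline excludes the .svn copy before grep ever sees it
    grep -l needle $(find demo -type f | grep -v '\.svn')
    # prints only demo/src/code.pl
    ```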

    tips

    Git commits per contributor one-liner


    By Jon Jensen
    March 18, 2009

    Just for fun, in the Spree Git repository:

    git log | grep ^Author: | sed 's/ <.*//; s/^Author: //' | sort | uniq -c | sort -nr
        813 Sean Schofield
         97 Brian Quinn
         81 Steph (Powell) Skardal
         42 Jorge Calás Lozano
         37 paulcc
         27 Edmundo Valle Neto
         16 Dale Hofkens
         13 Gregg Pollack
         12 Sonny Cook
         11 Bobby Santiago
          8 Paul Saieg
          7 Robert Kuhr
          6 pierre
          6 mjwall
          6 Eric Budd
          5 Fabio Akita
          5 Ben Marini
          4 Tor Hovland
          4 Jason Seifer
          2 Wynn Netherland
          2 Will Emerson
          2 spariev
          2 ron
          2 Ricardo Shiota Yasuda
          1 Yves Dufour
          1 yitzhakbg
          1 unknown
          1 Tomasz Mazur
          1 tom
          1 Peter Berkenbosch
          1 Nate Murray
          1 mwestover
          1 Manuel Stuefer
          1 Joshua Nussbaum
          1 Jon Jensen
          1 Chris Gaskett
          1 Caius Durling
          1 Bernd Ahlers

    git spree

    She sells C shells by the seashore


    By Christopher Nehren
    March 16, 2009

    In contrast to my previous post on tabs in vim, here’s a different way of managing multiple files, multiple SQL console sessions, multiple nearly anything. This works with any program that behaves well with regard to Unix job control, and really allows Unix to shine as an IDE in its own right. The emphasis here will be on using whatever tools are suitable to do the job, rather than on one specific editor. Note that the details given here will not work very well for network programs that assume a constant connection like an IRC client. However, at least the Postgres and MySQL consoles both support this feature, and they’re the only “networked” applications I can imagine using in this way. This post will focus more on a way of thinking than on technical know-how, though there is a bit of how-to mixed in.

    Most readers are familiar with backgrounding a task at a Unix terminal with ^Z and then bg. Something that is less common, at least in my experience (in favor of GNU screen and the like), is using shell job control for anything more than detaching a running program from one’s terminal. When applied liberally, the tactic allows one to harness the power of Unix all via one login. To …
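    Those basics can be sketched non-interactively (in an interactive shell you would suspend a foreground task with ^Z rather than launching it with &):

    ```shell
    # Start two long-running tasks in the background with &
    sleep 1 &
    sleep 1 &
    jobs        # list the background jobs this shell is tracking
    wait        # block until every background job has finished
    echo "all jobs done"
    ```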


    tips

    End Point SEO with Linkscape


    By Steph Skardal
    March 12, 2009

    Linkscape was released in October of 2008 and is SEOmoz’s collection of web index data, currently covering 36 billion URLs across 225 million domains. You must have a pro membership to access advanced reporting, but without one you can access basic data such as mozRank (SEOmoz’s own logarithmic metric for page popularity) for a URL, the number of links to a URL, the number of domains linking to a URL, and mozRank for the domain.

    For example, I ran a basic report on www.google.com and found:

    More interesting data on www.facebook.com:

    Because I haven’t done justice to describing Linkscape, please read more about Linkscape, or see the Linkscape Comic for a visual introduction.

    Case Study

    In an effort to examine and improve End Point’s search engine performance, I pulled together some snippets of data from Linkscape for …


    seo
    Page 209 of 224