    Musica Russica Launches with Piggybak

    By Steph Skardal
    September 3, 2012

    The new home page for Musica Russica.

    Last week, we launched a new site for Musica Russica. The old site was running on an outdated version of Lasso and FileMaker and was approximately 15 years old. Although it was still chugging along, finding hosting support and developers for an outdated platform becomes increasingly challenging as time goes on. The new site runs on Ruby on Rails 3 with Nginx and Unicorn and uses the open source Rails gems RailsAdmin, Piggybak, CanCan, and Devise. RailsAdmin is a great open source Rails admin tool that I’ve blogged about before (here, here, and here). Piggybak is End Point’s homegrown, lightweight ecommerce platform, also blogged about several times (here, here, and here). Below are a few more details on the site:

    • The site includes Rails 3 goodness such as an elegant and thorough MVC architecture, advanced routing to encourage clean, user-friendly URLs, the ability to integrate modular elements (Piggybak, RailsAdmin) with ease, and several built-in performance options. The site also features a few other popular Rails gems such as Prawn (for printing order and packing slip PDFs), Rack-SSL-Enforcer (a nice tool for enforcing SSL pages), …
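    To make the gem lineup concrete, here is a minimal Gemfile sketch for a stack like the one described above. The gem names come from the post; the versions, source line, and exact selection are assumptions, not the site’s actual Gemfile.

      # Hypothetical Gemfile sketch: not Musica Russica's actual Gemfile
      source 'https://rubygems.org'

      gem 'rails', '~> 3.2'       # Rails 3
      gem 'unicorn'               # app server, proxied by Nginx

      gem 'devise'                # authentication
      gem 'cancan'                # authorization
      gem 'rails_admin'           # admin interface
      gem 'piggybak'              # lightweight ecommerce engine

      gem 'prawn'                 # order and packing slip PDFs
      gem 'rack-ssl-enforcer'     # enforce SSL on selected pages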

    clients ecommerce piggybak rails

    DevCamps: Creating new camps from a non-default Git branch

    By Brian Gadoury
    August 31, 2012

    I recently set up part of a DevCamps installation for a new Rails project with a unique Git repo setup, and discovered a trick for creating camps from a Git branch other than master. Admittedly, the circumstances that led to my discovering this trick are a bit specific to this project, but the trick itself can be useful in other situations as well.

    The Git repo specified in local-config had a master branch with nothing in it but the standard “initial commit.” This relatively new project uses a simplified git-flow workflow, and as such, all its code was still in the “develop” branch.

    In my case, this empty-ish master branch meant there were no tracked files in the CAMP_PATH/public directory. This meant that Git did not create that directory when the repo was cloned by mkcamp. This meant that apache2 would refuse to start. Camping without a web server makes my back hurt, so I snooped around a little bit…

    I discovered two things:

    1. You can tell git clone which branch to check out initially by passing it a ‘--branch $your_non_default_branch’ switch
    2. The mkcamp command will happily pass that switch (as well as any other spicy options you include) along to …
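    For example, outside of mkcamp, cloning a repository with a non-default branch checked out looks like this (the URL and directory name here are hypothetical):

      # Clone with the develop branch checked out instead of master
      $ git clone --branch develop git@example.com:project.git mycamp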

    camps git hosting

    Automatically kill process using too much memory on Linux

    By Jon Jensen
    August 30, 2012

    Sometimes on Linux (and other Unix variants) a process will consume way too much memory. This is more likely if you have a fair amount of swap space configured, though still within the range of normal, for example, as much swap as you have RAM.

    There are various methods to try to limit trouble from such situations. You can use the shell’s ulimit setting to put a hard cap on the amount of RAM allowed to the process. You can adjust settings in /etc/security/limits.conf on both Red Hat- and Debian-based distros. You can wait for the OOM (out of memory) killer to notice the process and kill it.
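    For illustration, the first two approaches look something like this (the 2 GB value is arbitrary):

      # Shell ulimit: cap virtual memory (in KB) for this shell and its children
      $ ulimit -v 2097152

      # /etc/security/limits.conf: hard address-space cap (in KB) for one user
      someuser   hard   as   2097152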

    But none of those remedies helps in situations where you want a process to be able to use a lot of RAM sometimes, when there’s a point to it and it’s not just stuck in an infinite loop that will eventually use all memory.

    Sometimes such a bad process will bog the machine down horribly before the OOM killer notices it.

    We put together the following script about a year ago to handle such cases:

    It uses the Proc::ProcessTable module from Perl’s CPAN to do the heavy lifting. We invoke it once per minute in cron. If you have processes eating up memory so quickly that they bring down the machine in less than a …
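    That script isn’t reproduced here, but a minimal sketch of the same idea, suitable for running from cron, might look like the following. The memory threshold, the signal, and the lack of any process whitelist are assumptions for illustration, not the actual script’s policy.

      #!/usr/bin/env perl
      # Sketch: terminate any process whose resident memory exceeds a threshold.
      use strict;
      use warnings;
      use Proc::ProcessTable;

      my $limit_bytes = 2 * 1024 * 1024 * 1024;    # 2 GB RSS threshold (assumed)

      my $t = Proc::ProcessTable->new;
      for my $p (@{ $t->table }) {
          next unless $p->rss > $limit_bytes;
          next if $p->pid == $$;                   # never kill this script itself
          warn sprintf("killing pid %d (%s), rss=%d bytes\n",
                       $p->pid, $p->fname, $p->rss);
          kill 'TERM', $p->pid;
      }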


    hosting linux perl

    Git: Delete your files and keep them, too

    By Jeff Boes
    August 30, 2012

    I was charged with cleaning up a particularly large, sprawling set of files comprising a Git repository. One whole “wing” of that structure consisted of files that needed to stay around in production: various PDFs, PowerPoint presentations, and Windows EXEs that were only ever needed by the customer’s partners and were downloaded from the live site. Our developer camps never wanted to have local copies of these files, which amounted to over 280 MB; and since we have dozens of camps shadowing this repository, all on the same server, this will save a few GB at least.

    I should point out that our preferred deployment is to have production, QA, and development all be working clones of a central repository. Yes, we even push from production, especially when clients are the ones making changes there. (Gasp!)

    So: the aim here is to make the stuff vanish from all the other clones (when they are updated), but to preserve the stuff in one particular clone (production). Also, we want to ensure that no future updates in that “wing” are tracked.

    # From the "production" clone:
     $ cd stuff
     $ git rm -r --cached .
     $ cd ..
     $ echo "stuff" …
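    Once that change is committed and pushed and the other clones pull it, Git removes the files there, while in the production clone they remain on disk as ignored, untracked files. One way to confirm that in the production clone (using the example’s stuff directory):

      # Should list stuff/ under "Ignored files", not as deleted
      $ git status --ignored -- stuff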

    git

    Company Update August 2012

    By Zed Jensen
    August 24, 2012

    Everyone here at End Point has been busy lately, so we haven’t had as much time as we’d like to blog. Here are some of the projects we’ve been knee deep in:

    • The Liquid Galaxy Team (Ben, Adam, Kiel, Gerard, Josh, Matt) has been working on several Liquid Galaxy installations, including one at the Monterey Bay National Marine Sanctuary Exploration Center in Santa Cruz, and one for the Illicit Networks conference in Los Angeles. Adam has also been preparing Ladybug panoramic camera kits for clients to take their own panoramic photos and videos. The Liquid Galaxy team welcomed new employees Aaron Samuel in July, and Bryan Berry just this week.
    • Brian B. has been improving a PowerCLI script to manage automated cloning of VMware vSphere virtual machines.
    • Greg Sabino Mullane has been working on various strange PostgreSQL database issues, and gave a riveting presentation on password encryption methods.
    • Josh Tolley has been improving panoramic photo support for Liquid Galaxy and expanding a public health data warehouse.
    • David has been at work on a web-scalability project to support customized content for a Groupon promotion, while continuing to benefit from nginx caching. He has also been …

    company

    Paginating API call with Radian6

    By Marina Lohova
    August 24, 2012

    I wrote about Radian6 in my earlier blog post. Today I will review one more aspect of the Radian6 API: call pagination.

    Most Radian6 requests return paginated data. This introduces the extra complexity of making the request several times in a loop in order to get all the results. Here is one simple way to retrieve paginated data from Radian6 using Ruby’s powerful blocks.

    I will use the following URL to fetch data:

    /data/comparisondata/1338958800000/1341550800000/2777/8/9/6/

    Let’s decipher this; a quick sketch of assembling this path follows the list below.

    • 1338958800000 is start_date and 1341550800000 is end_date for the document search. That’s June 6, 2012 to July 6, 2012, formatted with date.to_time.to_i * 1000.

    • 2777 is topic_id, a Radian6 term denoting a set of search data for each customer.

    • 8 stands for the Twitter media type. There are various media types in Radian6; they reflect where the data came from. The media_types parameter can include a comma-separated list of values for different media types.

    • 9 and 6 are page and page_size respectively.
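    As a quick sketch of assembling that path (the variable names here are mine, not part of the Radian6 wrapper class):

      require 'date'

      # Milliseconds since the epoch, as described above; exact values depend on the
      # local time zone.
      start_date = Date.new(2012, 6, 6).to_time.to_i * 1000
      end_date   = Date.new(2012, 7, 6).to_time.to_i * 1000

      topic_id   = 2777   # the customer's Radian6 topic
      media_type = 8      # Twitter
      page       = 9
      page_size  = 6

      path = "/data/comparisondata/#{start_date}/#{end_date}/#{topic_id}/#{media_type}/#{page}/#{page_size}/"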

    First comes the method to fetch a single page.

    In the Radian6 wrapper class:

    def page(index, &block)
      data = block.call(index) 
      articles, count = data['article'], data[ …
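    The rest of the wrapper isn’t shown here, but one way to drive that method across all pages, defined alongside page in the wrapper class, is a loop like the following. This is a sketch only; it assumes page ends up returning the articles array together with the total count, which is an assumption about the elided code above.

      # Sketch: keep fetching pages until we've collected `count` articles.
      def all_articles(&block)
        results = []
        index = 1
        loop do
          articles, count = page(index, &block)
          results.concat(articles)
          break if articles.empty? || results.size >= count
          index += 1
        end
        results
      end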

    rails api

    Merging Two Google Accounts: My Experience

    By Steph Skardal
    August 21, 2012

    Before I got married, I used a Gmail account associated with my maiden name (let’s call this account A). And after I got married, I switched to a new Gmail address (let’s call this account B). This caused daily annoyances, as my use of various Google services was split between the two accounts.

    Luckily, there are some services in Google that allow you to easily toggle between two accounts, but there is no easy way to define which account to use as the default for which service, so I found myself toggling back and forth frequently. Unfortunately, Google doesn’t provide functionality to merge multiple Google accounts. You would think they might, especially given my particular situation, but I can see how it’s a bit tricky to logically determine how to merge data. So, instead, I set off on migrating all data to account B, described in this post.

    Consider Your Google Services

    First things first, I took a look at the Google services I used. Here’s how things broke down for me:

    • Gmail: Account A forwards to account B. I always use account B.
    • Google+: Use through account A.
    • Google Analytics: Various accounts divided between account A and account B. …

    tools

    Using Different PostgreSQL Versions at the Same Time

    By Szymon Lipiński
    August 20, 2012

    When I work for multiple clients on multiple different projects, I usually need a bunch of different stuff on my machine. One of the things I need is to have multiple PostgreSQL versions installed.

    I use Ubuntu 12.04. Installing PostgreSQL there is quite easy. Currently two versions are available out of the box: 8.4 and 9.1. To install them, I used the following command:

    ~$ sudo apt-get install postgresql-9.1 postgresql-8.4 postgresql-client-common

    Now I have the above two versions installed.

    Starting the database is also very easy:

    ~$ sudo service postgresql restart
     * Restarting PostgreSQL 8.4 database server   [ OK ]
     * Restarting PostgreSQL 9.1 database server   [ OK ]

    The problem I had for a very long time was using the proper psql version. Both databases installed their own programs, like pg_dump and psql. Normally you can use pg_dump from the higher PostgreSQL version; however, using different psql versions can be dangerous, because psql uses a lot of queries that dig deep into PostgreSQL’s internal tables to get information about the database. Those internals sometimes change from one database version to another, so the best solution is to use the psql from the …
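    For instance, each version keeps its own binaries under a version-specific directory, and the Debian/Ubuntu wrapper scripts can select a cluster explicitly. A quick sketch, assuming the default main clusters (check pg_lsclusters for the actual ports):

      # List installed clusters and their ports
      ~$ pg_lsclusters

      # Run a specific version's psql directly...
      ~$ /usr/lib/postgresql/9.1/bin/psql -p 5433

      # ...or let the pg_wrapper scripts pick it by cluster
      ~$ psql --cluster 8.4/main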


    postgres ubuntu