Building CandiData

This past weekend, my colleague and friend Sandy Smith and I participated in Election Hackathon 2012 (read his take on the hackathon). We built our first public Musketeers.me product, CandiData.me. This was my first hackathon, and it was exciting and exhausting to bring something to life in little more than 24 hours. Our idea combined a number of APIs to produce a profile for every candidate running for President or Congress in the United States. The seed of the idea was good enough that we were chosen as one of 10 projects to present to the group at large on Sunday afternoon.

Under the Hood and Hooking Up with APIs

We used our own PHP framework, Treb, as our foundation. It provides routing by convention, controllers, db access, caching, and a view layer. Along the way, we discovered a small bug in our db helper function that failed because of the nuances of autoloading.
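Routing by convention generally means the URL itself names the controller and action, so no route table is needed. Here’s a minimal sketch of the idea – the URL scheme and class names are hypothetical, not Treb’s actual code:

    <?php
    // Illustrative sketch of routing by convention: /candidate/show/42
    // maps to CandidateController::show(42). Not Treb's actual code.
    $path = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
    list($controller, $action, $id) = array_pad(explode('/', $path), 3, null);

    $class  = ucfirst($controller ? $controller : 'home') . 'Controller';
    $method = $action ? $action : 'index';

    $page = new $class();      // the autoloader locates the class file by name
    echo $page->$method($id);  // the action returns rendered view output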

I quickly wrote up a base class for making HTTP GET requests to REST APIs. The client uses PHP’s native stream functions for making the HTTP requests, which I’ve found easier to work with than the cURL extension; the latter is a cumbersome wrapper around the underlying cURL library.
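Roughly, a stream-based GET client looks like the sketch below; the class name and details here are illustrative, not the actual code from the hackathon:

    <?php
    // Minimal GET client for JSON REST APIs, built on PHP's HTTP stream
    // wrapper rather than the cURL extension. Names are hypothetical.
    class RestClient
    {
        public function get($url, array $params = array())
        {
            if ($params) {
                $url .= '?' . http_build_query($params);
            }

            $context = stream_context_create(array(
                'http' => array(
                    'method'  => 'GET',
                    'timeout' => 10,
                    'header'  => "Accept: application/json\r\n",
                ),
            ));

            // file_get_contents() speaks HTTP when handed a stream context
            $body = file_get_contents($url, false, $context);

            return ($body === false) ? null : json_decode($body, true);
        }
    }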

To be good API clients, we cached the request responses in Memcached for anywhere from an hour to a month, depending on how often we anticipated each API’s response would change.
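The caching layer amounts to checking Memcached before hitting the network, with a TTL chosen per endpoint. A sketch, again with illustrative names:

    <?php
    // Check Memcached first; only make the HTTP request on a cache miss.
    // Function and key names are illustrative, not the real code.
    function cached_get(Memcached $cache, RestClient $client, $url, $ttl)
    {
        $key = 'api:' . md5($url);

        $data = $cache->get($key);
        if ($data !== false) {
            return $data; // cache hit, skip the HTTP request entirely
        }

        $data = $client->get($url);
        if ($data !== null) {
            $cache->set($key, $data, $ttl); // TTL: an hour up to a month
        }

        return $data;
    }

    // Poll data changes often, so it might get the short end of the range:
    // $polls = cached_get($cache, $client, $pollsterUrl, 3600);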

Sandy also took on the tedious – but not thankless – task of creating a list of all the candidates, which we imported into a simple MySQL table. For each candidate, we could then pull in information such as:

  • Polling data from Huffington Post’s Pollster API, which we then plotted using jqPlot. Polls weren’t available for every race, so we had to manually match available polls to candidates.
  • Basic biographical information from govtrack.us.
  • Campaign finance and fact-checked statements from the Washington Post’s APIs.
  • Latest news, courtesy of search queries to NPR’s Story API.
  • A simple GeoIP lookup on the homepage to populate the congressional candidates when a user loads the page (a sketch follows this list).
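As for that GeoIP lookup, here’s roughly what it can look like. This sketch assumes the PECL geoip extension; the post doesn’t record which library we actually used:

    <?php
    // Hypothetical version of the homepage GeoIP lookup using the PECL
    // geoip extension; falls back to a default state if the lookup fails.
    function state_for_visitor($ip, $default = 'VA')
    {
        // geoip_region_by_name() returns the country code and region
        // (the two-letter state code for US addresses), or false on error
        $record = @geoip_region_by_name($ip);

        if ($record && $record['country_code'] === 'US' && $record['region']) {
            return $record['region'];
        }

        return $default;
    }

    // Usage: $state = state_for_visitor($_SERVER['REMOTE_ADDR']);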

Bootstrap for UI goodness

I used this opportunity to check out Twitter’s Bootstrap framework. It gave us a clean design from the start, and we were able to use its classes and responsive grid to make the site look really nice on tablets and smartphones too. I found it a lot more feature-filled than Skeleton, which is just a responsive CSS framework and lacks the advanced UI elements – navigation, dropdowns, modals – that Bootstrap provides.

Improvements that we could make

We’ve already talked about a number of features we could add or rework to make the site better. Of course, given the short shelf life this app will have after November 6th, we may not get to some of these.

  • Rework the state navigation on the homepage so that it plays nicely with the browser’s history. We did a simple Ajax query on load, but a better approach would be to put the state in the URL hash (e.g., “http://candidata.us/#VA”) and then pull in the list of candidates. This would also let us initiate the GeoIP lookup only when the hash is missing.
  • Add a simple way to navigate to opponents from a candidate’s page.
  • Allow users to navigate to other state races from a candidate’s page.
  • Get more candidate information, ideally from a source that provides photos of each candidate. Other apps at the hackathon had photos, but we didn’t find the API in time. Sunlight provides photos for Members of Congress.
  • Pull in statements made by a candidate via WaPo’s Issue API, perhaps running them through the Trove API to pull out the categories, people, and places mentioned in each statement.
  • Use the Trove API to organize, or at least tag, the latest news stories and fact checks by category.

Overall, I’m very happy with what we were able to build in 24 hours. The hackathon also exposed me to some cool ideas and approaches, particularly the visualizations done by some teams. I wish I’d spent a little more time meeting other people, but my energy was really focused on coding most of the time.

Please check out CandiData.me and let me know what you think either via email or in the comments below.

Startups are for everybody

I came across Dan Crow’s insights into startups and older workers this morning, and I couldn’t stop myself from nodding in agreement throughout the article. Part of it, surely, is that I am now closer to 40 than 30. But everything he says about the value of spending time with family, the pointlessness of working grueling hours, and the skills that come from experience had an air of “I’ve been there”.

Many startups, especially in Silicon Valley, have a macho culture of working extremely long hours. I vividly recall a long stretch of consecutive 100+ hour weeks at Apple early in my career — which came on top of a 3 hour commute to San Francisco. The quality of my work noticeably declined, and it took me months to get my focus and energy back afterwards.

It seems that both corporate America and Silicon Valley startups, while vastly different cultures in almost every regard, still see people as expendable resources that can be used up and replaced. Sure, if you’re working nonstop for a startup, you can tell yourself that there’s a huge payoff at the end, or at least the chance of one. But the risk is that you spend your 20s and early 30s working endless hours without much to show for it. That was never something I wanted to do, and I’m lucky that I didn’t have to, either.

Why startups shouldn’t just be for the young

Virginia Tech professor cuts PC energy usage

I came across this link about a Virginia Tech professor who wrote a program to reduce the energy wasted by computers. On the face of it, it sounds a bit smarter than the usual power-management techniques that shut off parts or all of a computer when it’s not being used. It claims up to a 30% reduction in power, which I expect translates into noticeably longer times between battery charges if you use it on a laptop.

He also likens his software to cruise control in cars. But while cruise control will ease up on the accelerator as a car gathers speed when descending, it “doesn’t look ahead and say, ‘Hey, there’s a hill coming,’” Cameron says. “It just reacts.” In contrast, his program seeks out patterns to determine when a computer will need more power. He says users report electricity savings of 30 percent.

I expected the program, named Granola and distributed by MiserWare, to be a Windows-only affair. However, it’s available for both Windows and various distributions of Linux. The latter is likely a result of wanting to reduce server power usage, an area where Linux has a sizeable presence. There is even a community-supported AUR package of Granola for Arch Linux. Joy!

TiVo Premiere Elites released

The TiVo community has started to dissect the latest DVR from TiVo and is finding intriguing new features, along with more than 300 hours of HD recording capacity. Another interesting find: the unit is more energy-efficient, drawing about 20 watts.

They’ve even started opening up their brand-new boxes, taking photos of the guts, and posting a couple of new screens. Premiere-to-Premiere streaming is working, and it even works for copy-protected content that doesn’t work with Multi-Room Viewing.

The TiVo Premiere Elite is in Customer Hands, TiVo Community Starts Analysis | Gizmo Lovers Blog

Japan’s nuclear crisis: Will it end in meltdown?

Slate provides a levelheaded assessment of nuclear reactor risk and the safety lessons from the crisis in Japan. I hope that, once the situation is resolved, we can still look at nuclear power objectively in the USA.

To head off the next nuclear accident, we need to rethink the parameters of plant design. Why do we build backup cooling pumps for reactors but not for spent-fuel pools? And we need layers of protection that are truly independent.

Man vs. Meltdown