
Charities and Hackathons

Someone asked an interesting question at a charity-focused hackathon I attended last year, along the lines of: "This is the third year we've held this event to help charities, so why aren't more of them here with projects?". From what I've seen of medium-to-large charities, the answer would be something like: "the technology problems that charities have to deal with can't easily be tackled at a hackathon". Technology problems in charities tend to be things like:

  • Needing to modify existing systems to handle new initiatives
  • Correctly piping data from one system to another
  • Bringing order to ad-hoc systems that have evolved in Excel
Those sorts of problems are hard to take to a hackathon because handling them properly requires so much knowledge of existing systems, processes and data. A good hackathon project, by contrast, is generally one that mashes data from X with APIs from Y and Z to make a new self-contained app.

Hackathons are great for generating experimental new apps, but charities are not well placed to take advantage of experimental initiatives. I've seen some great prototypes get built, but they rarely seem to be picked up by charities. I think that's probably because decision makers in large charities can't generally divert time and funds from established programmes at short notice to plough them into experimental ventures. This is compounded by the 'post-hackathon dispersion' effect - once the hackathon is over and the pizza boxes are cleared away, the teams tend to disperse and it's hard to keep up momentum.

Hence, despite hackathons (a bunch of smart people with effort to donate) and charities (worthy causes in need of help) being - on paper - a great combination, I haven't seen it done in a really integrated way yet. Charity hackathons are great for sharing ideas and trying things out, but there's still some figuring out to do about how we can harness that effort for maximum lasting benefit.
