…and how we achieved it
This was a long weekend – again. Usually it takes working through the night to take the grand prize home at a hackathon – luckily, we managed it while still getting some sleep at home.
But there was still quite a lot of work to do. If you’re not familiar with the concept of a hackathon, or if you want to contribute or tell others about the experience you had at EyeEm’s Photo Hack Day in Berlin, feel free to use our storytelling tool Qurate:
an overview of social media elements taken at Photo Hackday (add your own!): https://www.qurate.de/canvas/experience/51a9b778c05d060200000017
see what I experienced: https://www.qurate.de/post/51a9bfe0e6b9260200000030#
Personally, I went to the Photo Hack Day with an idea in mind that I had registered on Hacker League as “Tarantimgo”. I wanted to write an open service offering a crowdsourced notion of which sounds and songs could match a set of pictures, so photo platforms could use it to offer ambient sound to their viewers (maybe a good topic for Battle Hack? Let me know if you need it 😉).
Then I met some good old acquaintances – Stefan, Bora, Rob, Gabi – and we fell into a discussion about what to hack this weekend, when Stefan Hoth (a rather prominent face in Berlin who works as a community and technology advocate for Google services) mentioned a guy “running around on site and desperately looking for developers”. Since Robert, Gabriele and I had taken home an award at the last hackathon for a “project with a promising business adaptation”, we said: OK, go get him.
It turned out this was Albert. He’s from Armenia – quite an exotic place somewhere between Russia, Turkey and Iran – works in London and currently lives in Berlin. What he pitched to us was:
You find many nice images of nice places on EyeEm. If you find a professional shot of a place you can adore it, favorite it, like it – but wouldn’t it be nice if you could recreate it with a personal touch? Let’s write an app that shows you how to take that shot yourself, including EXIF data, the time and season of the shooting, and the right place.
We thought about it a little (and not everyone was convinced at first). Robert and I (both developers) had already worked together on three hackathons, and I knew that Gabriele could lend a hand with Bootstrap, so we finally gave in, took the job and started the project Photoration.
We came up with some scribbles of the idea that we could agree on. What we wanted to build for the hackathon was an application that
- shows you which “sights” are nearby
- shows you “professional” photos taken at those sights (sourced from EyeEm)
- shows the position of the shooting point (the image’s GPS coordinates, actually) and the position of the sightseeing spot / monument on one map
- shows additional metadata for the image (ideally EXIF data)
Albert’s job was to create “final” design scribbles of these ideas. Here are his results:
We’re working with Node.js and a web frontend. That’s a preset I tend to enforce, because I work with it all day and many people can pick it up pretty fast. I set up a fresh Heroku app, added a remote on GitHub, got Robert in within minutes, and off we went. I concentrated on the structure and prerequisites; Robert’s task was to create a suitable backend.
We quickly found that the EyeEm API doesn’t allow searching for sightseeing places directly. You can search for albums that match a certain Foursquare location, and you can also search for venues near a location (via a dedicated endpoint), but neither matched our criteria.
We decided to get those special locations from the Foursquare API, which doesn’t require a user token for this. Robert manually picked the Foursquare venue category IDs that we could reuse in the EyeEm API calls later:
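As an illustration of how such a list plugs into the requests (the IDs below are placeholders, not the ones Robert actually picked from Foursquare’s category tree), Foursquare’s venue search accepts a comma-separated `categoryId` parameter, so the whole set fits into a single query:

```javascript
// Illustrative sketch — these category IDs are placeholders, not the
// actual "sightseeing" IDs Robert picked from the Foursquare category tree.
var SIGHT_CATEGORY_IDS = [
  'CATEGORY_ID_MONUMENT_LANDMARK',
  'CATEGORY_ID_HISTORIC_SITE',
  'CATEGORY_ID_SCENIC_LOOKOUT'
];

// The venues/search endpoint takes a comma-separated categoryId list,
// so one request can cover all of our sightseeing categories at once:
var categoryParam = SIGHT_CATEGORY_IDS.join(',');
console.log(categoryParam);
```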
On first load our application
- loads Foursquare venues of those category IDs near a location the client provides
- fires a query to EyeEm’s /albums endpoint for each venue found (yes, that’s a lot of simultaneous queries) – you can provide a venue’s Foursquare ID to get matching photo albums
- fires another query for each album found to get its best-rated photos, because we’re only interested in the “highlights” that people might find worth recreating
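The fan-out above can be sketched roughly like this, with the actual HTTP calls stubbed out (`fetchVenues`, `fetchAlbum` and `fetchTopPhotos` are invented names standing in for the real Foursquare and EyeEm requests, not their actual client APIs):

```javascript
// Sketch of the backend fan-out. The three fetch* functions are stubs —
// in the real app they would hit Foursquare venues/search and EyeEm's
// /albums and photo endpoints.

async function fetchVenues(lat, lng) {
  // real version: Foursquare venue search with our category IDs
  return [{ id: 'venue-1' }, { id: 'venue-2' }];
}

async function fetchAlbum(venueId) {
  // real version: EyeEm /albums, filtered by the venue's Foursquare ID
  return { id: 'album-' + venueId };
}

async function fetchTopPhotos(albumId) {
  // real version: the album's photos, sorted by rating
  return ['photo-a-' + albumId, 'photo-b-' + albumId];
}

async function loadHighlights(lat, lng) {
  const venues = await fetchVenues(lat, lng);
  // one query per venue, fired in parallel — this is the
  // "lot of simultaneous queries" mentioned above
  const albums = await Promise.all(venues.map(v => fetchAlbum(v.id)));
  const photos = await Promise.all(albums.map(a => fetchTopPhotos(a.id)));
  return albums.map((album, i) => ({ album: album.id, photos: photos[i] }));
}

loadHighlights(52.52, 13.40).then(result =>
  console.log(JSON.stringify(result))
);
```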
That’s nearly everything our backend does. It would be helpful if the API understood arrays of IDs, to reduce the number of requests we have to send.
I don’t want to go into too much detail here about why, but I’m using Backbone.js and a single-page-app approach for the frontend. I simply love this framework for its unobtrusiveness, and I know it works very well for mobile applications as long as you know what you’re doing. I also went with Bootstrap 2.3.2, since it comes with responsive features. I created a Backbone router that dynamically creates its routes and points to handler methods in two Backbone.Views, which I treat as separate frontend classes. Using that approach I could delegate some frontend work to Robert (he implemented parts of the single-image view) when I was getting tired around noon on the second day. Meanwhile, Gabriele updated the Bootstrap CSS with custom fonts and colors.
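The dynamic-route idea can be illustrated without pulling in Backbone itself. Here’s a rough plain-JavaScript sketch of the same pattern — routes with `:param` placeholders are turned into regexes and dispatched to handlers (the route names and handlers are invented for illustration; the real app uses a `Backbone.Router`):

```javascript
// Plain-JS sketch of dynamic route dispatch, mimicking what a
// Backbone.Router does under the hood. Routes and handlers here are
// illustrative, not the app's actual ones.
function Router() {
  this.routes = {};
}

Router.prototype.route = function (pattern, handler) {
  // 'album/:id' → /^album\/([^/]+)$/ — each :param becomes a capture group
  const regex = new RegExp('^' + pattern.replace(/:\w+/g, '([^/]+)') + '$');
  this.routes[pattern] = { regex: regex, handler: handler };
};

Router.prototype.navigate = function (fragment) {
  for (const key in this.routes) {
    const match = fragment.match(this.routes[key].regex);
    // captured params are passed to the handler in order
    if (match) return this.routes[key].handler.apply(null, match.slice(1));
  }
  return null;
};

// Two "views" registering their own routes, as the Backbone views did:
const router = new Router();
router.route('albums', function () { return 'album list'; });
router.route('album/:id/photo/:photoId', function (id, photoId) {
  return 'photo ' + photoId + ' in album ' + id;
});

console.log(router.navigate('album/42/photo/7'));
```

Having each view register its own handlers is what made it easy to hand a whole view over to someone else mid-hack.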
If you want to read through the code, go ahead – but please don’t blame me for the many style sins and oversights (you don’t put API keys in a public repo, do you? No, you don’t!).
We ended up with a fully functional but (you might have guessed) rather cheesy web application that runs on mobile clients. You can check it out yourself by pointing your phone to http://photoration.herokuapp.com (give the dyno some time to wake up, and wait for the many API calls to return).
After the initial loading has finished, you can browse through EyeEm albums of nearby sightseeing spots. If you find a photo that looks gorgeous, tap it to get some details and to see a combined map of the spot and the photo.
First, I’d like to give my warmest thank-you to my team: Robert, Albert, Gabriele. The four of us know that we couldn’t have made this possible without – the four of us 🙂 Thanks a lot to the EyeEm team: you do an incredible job, and you did an amazing one on that Hack Day. The spirit at that location is far better than at any other hackathon I’ve ever attended. Thanks a lot to GitHub, Node.js, Backbone and Bootstrap for being there and making the bootstrapping of applications on a weekend dead simple (if you know what you’re doing).
Why did we win that event? A question you’d have to ask the jury. But personally I have the feeling that this app (even though, technically, there’s absolutely no rocket science involved) really does – or tries to do – something valuable: it wants to help you take better pictures. An iPad application that automatically takes pictures of a cat, or a Christmas tree that blinks in the light of certain images, is funny, but it doesn’t bring much value to the community (though maybe to the world, especially for cat lovers 😉).
Where to go from here?
There are lots of improvements due. Massive caching of results would be a brilliant idea. A swipe-capable frontend (as scribbled by Albert) would be really nice. A photo overlay would be even nicer: let’s show the chosen picture transparently over your camera preview. A curated selection of really good pictures would be very helpful (plus a connection to professional photo services like 500px, Fotolia and others). Maybe this could be achieved by rating pictures as “pro shot” instead of “nice one”.
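The caching could start very small: even an in-memory cache with a time-to-live in front of the API fan-out would cut most of the repeated requests. A minimal sketch, assuming stale results for a few minutes are acceptable (the class and key scheme are mine, not part of the app):

```javascript
// Minimal in-memory TTL cache, as a first step toward the
// "massive caching of results" idea. Entries expire after ttlMs.
function TtlCache(ttlMs) {
  this.ttlMs = ttlMs;
  this.entries = {}; // key → { value, expiresAt }
}

TtlCache.prototype.get = function (key, now) {
  const t = now !== undefined ? now : Date.now();
  const entry = this.entries[key];
  if (!entry || entry.expiresAt <= t) return undefined; // missing or stale
  return entry.value;
};

TtlCache.prototype.set = function (key, value, now) {
  const t = now !== undefined ? now : Date.now();
  this.entries[key] = { value: value, expiresAt: t + this.ttlMs };
};

// Usage: key by rounded coordinates, so nearby clients share one result.
const cache = new TtlCache(5 * 60 * 1000); // five minutes
cache.set('52.52,13.40', ['album-1', 'album-2']);
console.log(cache.get('52.52,13.40'));
```

The optional `now` parameter just makes expiry testable; production code would rely on `Date.now()`.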
If you think this stuff has potential, feel free to connect with our team members. You can find us here:
More details on the project can be found on hackerleague: https://www.hackerleague.org/hackathons/photo-hack-day/hacks/photoration-app