Friday, December 19, 2014

Time Tracking from Commit Messages

Everybody hates time tracking. Mostly because it takes some time (even if it's just a few minutes) to open the time-tracking system, enter the duration, choose the project and maybe write a comment. Programmers have it a little easier: they are already used to writing comments about their work into a VCS (Mercurial, Git, etc.).

Why not combine the two? What if you could write the time you spent on a feature directly into the commit message? Some tools support this already.

Trac (http://trac.edgewall.org/wiki/TimeTracking)

It's a plugin that is not easy to configure, but it lets you enter the time spent in parentheses, like this: "(1h)".

Freckle Time Tracking (http://mir.aculo.us/2009/10/12/instant-time-tracking-from-git-commit-messages/)

"So Freckle Time Tracking now comes with Github integration, which means you can instantly log time from Git commit messages when you push updates to Github."

Entering time looks like this: "f:2.5". More details: https://help.letsfreckle.com/import-export-api/log-time-from-commit-messages

Redmine (http://www.redmine.org/issues/1518)

Has supported this for about five years now. Format: "time 30 m" on a new line.

JIRA (https://confluence.atlassian.com/display/AOD/Processing+JIRA+issues+with+commit+messages)

It's the largest issue tracker, so of course it supports this. Format: "#time 1w 2d 4h 30m".

Rechnung+

I'm the developer of a small time-tracking system, and I don't use any of the above. Instead, I've made a small PHP script that analyzes the commit log and collects the times from it. Format: "[0.5h]" or "[22:15-23:00]". The script gathers the results into a simple report and is easy to customize.
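The original script is PHP, which I haven't published here; the core of the idea is just two regular expressions over the commit log. Here is a minimal sketch of it in Java (the class and method names are my own, not from the actual script):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CommitTimeParser {
    // Matches durations like "[0.5h]"
    private static final Pattern DURATION = Pattern.compile("\\[(\\d+(?:\\.\\d+)?)h\\]");
    // Matches time ranges like "[22:15-23:00]"
    private static final Pattern RANGE =
            Pattern.compile("\\[(\\d{1,2}):(\\d{2})-(\\d{1,2}):(\\d{2})\\]");

    /** Returns the total hours logged in one commit message. */
    public static double parseHours(String message) {
        double total = 0;
        Matcher d = DURATION.matcher(message);
        while (d.find()) {
            total += Double.parseDouble(d.group(1));
        }
        Matcher r = RANGE.matcher(message);
        while (r.find()) {
            int start = Integer.parseInt(r.group(1)) * 60 + Integer.parseInt(r.group(2));
            int end = Integer.parseInt(r.group(3)) * 60 + Integer.parseInt(r.group(4));
            if (end < start) end += 24 * 60; // the range crosses midnight
            total += (end - start) / 60.0;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(parseHours("Fix login bug [0.5h]"));   // 0.5
        System.out.println(parseHours("Refactor [22:15-23:00]")); // 0.75
    }
}
```

Run `parseHours` over each line of `git log` or `hg log` output and sum per author or per day to get a time sheet.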


Using all the power of millions of idle computers

Do you know how busy your computer is on average? You think 50%? Open a task manager and check for yourself. Mine was only 2-3% busy, and that's with a browser, a text editor and a couple of other apps open. Unless you are creating or extracting an archive, your computer is doing almost nothing. That's wasted power. Think of the thousands and millions of computers turned on but doing almost nothing. Maybe that's not the case for computing professionals, but even they don't use more than 20% of what they have. Whatever you do: browsing, watching video, listening to music, typing... Especially typing: the machine could do millions of operations between every two letters you type. Why doesn't it?
In 2000, when I was graduating, one of the graduation topics was about "agents": small applications running on multiple computers on the network, performing some distributed task. It was a hot topic back then, and it looked like, in no time, all computers would run some kind of agent.
Now, in 2014, I can't recall having heard the word "agent" in years. What happened? And what can we do to revive the idea?

What are agents?

If you have heard of SETI@Home or Folding@Home, you know what I mean by an "agent". It is:
- a small application
- running on multiple computers
- receiving tasks to do from the Internet
- processing something
- sending the results back to the Internet server.
This effectively lets you use the power of thousands (millions) of computers to perform, in parallel, a very complicated task that has been split into smaller chunks. This is a huge potential. Imagine how much it would cost to purchase a thousand average computers, place them somewhere, connect them all, and pay for the power they consume and for maintenance. This is what companies like Google do. They have the money; many other organizations don't.
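Stripped of all infrastructure, the agent's job is a three-step loop: receive, process, report. Here is an illustrative Java toy of that cycle, with an in-memory queue standing in for the real Internet server and a trivial "sum a chunk" job standing in for the real computation:

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class AgentSketch {
    /** Stand-in computation: real projects would run signal analysis, protein folding, etc. */
    static long process(long[] chunk) {
        long sum = 0;
        for (long v : chunk) sum += v;
        return sum;
    }

    public static void main(String[] args) {
        // In-memory stand-ins for the server's task queue and result sink
        Queue<long[]> tasks = new ConcurrentLinkedQueue<>();
        Queue<Long> results = new ConcurrentLinkedQueue<>();
        tasks.add(new long[]{1, 2, 3});
        tasks.add(new long[]{10, 20});

        long[] chunk;
        while ((chunk = tasks.poll()) != null) { // 1. receive a task
            long result = process(chunk);        // 2. process it
            results.add(result);                 // 3. send the result back
        }
        System.out.println(results); // [6, 30]
    }
}
```

In a real agent, the two queues become HTTP calls to the project server, but the loop itself does not change.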

My suggestion

Programming a new system like SETI@Home is not a simple task. Redoing it again for every new project is a waste of time and not DRY (http://wikipedia.org/wiki/DRY). But the hardest part is persuading users to download and install yet another client on their computers. Why not have a general agent system capable of running any task? Participating organizations would only need to program the actual computational algorithms (about 20% of the work), not the 80% that is infrastructural code. This would give smaller organizations access to free computing power, benefiting society. Users would only need to install the agent once and could rest assured they are helping the world just by not turning their computers off.
It should be written in Java. I don't like Java that much, but there's no other language that can run on almost any computer in existence. It also offers relatively good security by isolating the running code from the operating system.
The client would be represented by a task bar icon. Opening the details would show the project it is working on (SETI, Folding or anything else), the current CPU idle percentage, the CPU idle percentage if no task were running, the total processing time donated since installation, and some kind of reward points collected for donating your computer. It should detect when the computer needs extra power for a user-initiated task and pause its own processing, so that users don't feel their computer is any slower than before.
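The pause-when-the-user-is-busy behaviour could be sketched like this in Java, using the JVM's system load average as the idleness signal. The 50% threshold and the chunking are illustrative assumptions; note that getSystemLoadAverage() returns -1 on platforms where it's unsupported (e.g. Windows), where a real client would need a native probe instead:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

public class PoliteWorker {
    // Illustrative threshold: back off when average load exceeds 50% per core
    static final double MAX_LOAD_PER_CORE = 0.5;

    /** Pure decision function: is the machine idle enough to keep working? */
    static boolean idle(double loadAverage, int cores) {
        if (loadAverage < 0) return true; // load average unsupported: assume idle
        return loadAverage / cores < MAX_LOAD_PER_CORE;
    }

    public static void main(String[] args) throws InterruptedException {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        for (int chunk = 0; chunk < 3; chunk++) {
            // The user started something heavy: pause and re-check every second
            while (!idle(os.getSystemLoadAverage(), os.getAvailableProcessors())) {
                Thread.sleep(1000);
            }
            // ... process one work chunk here ...
            System.out.println("processed chunk " + chunk);
        }
    }
}
```

Keeping the chunks small is what makes the pause responsive: the agent only checks the load between chunks, so a chunk should take seconds, not minutes.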
A server should maintain a list of projects and hold the source and processed data for each project. The results could be obtained by the project's owner organization. The project's website could show the total processing power in use (FLOPS).
BTW, do you know that the total processing power of Bitcoin mining machines has exceeded 16 petaFLOPS? The world's top ten supercomputers, all ten of them combined, can do only 5% of that. And Bitcoin mining is NOT what your average neighbour is doing. The amount of processing power being wasted is just mind-blowing. A cure for cancer might be found 10, 100, or 1,000 times faster. Imagine a thousand other projects that are NOT done for lack of computing power.
Would you allow your computer to do some research while you are browsing?