Today marks the end of the most demanding yet most interesting course I’ve ever taken at IITKGP. EC60007: Computational Neuroscience was challenging and frustrating, but overall a really informative course (not a pun). Information theory is a great field in itself, and this course served as a sort of primer to it as well. Taught by Prof. Sharba, who is a really cool yet somewhat feared person, it was an example of how courses should be.
The final term project of the course was due 5 days after the end-semester exams (i.e. December 5, today) and, no surprises here, despite repeated warnings from the professor, we had happily postponed our efforts. When the time actually came, saying that all hell broke loose would be an understatement.
The project itself seemed straightforward. We had spiking data from 4 neurons, 20 seconds each, with 50 repetitions per set. The plan of action was to fit a Linear-Nonlinear-Poisson (LNP) model to the data, similar in spirit to machine learning, but using a much more fundamental mathematical approach, plus a few extra parts.
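To give a flavour of what fitting an LNP model involves, here is a minimal sketch on synthetic data: a white-noise stimulus is passed through a linear filter (L), an exponential nonlinearity (N), and a Poisson spike generator (P), and the filter is then recovered via the spike-triggered average. The stimulus, filter shape, and nonlinearity here are all made-up stand-ins, not the actual course data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical white-noise stimulus and an arbitrary linear filter
# (the real project would use the recorded stimulus/spike data instead)
T, k_len = 20000, 15
stim = rng.standard_normal(T)
true_k = np.exp(-np.arange(k_len) / 4.0) * np.sin(np.arange(k_len) / 2.0)

# L: linear filtering; N: exponential nonlinearity; P: Poisson spiking
drive = np.convolve(stim, true_k, mode="full")[:T]
rate = np.exp(drive - 2.0)      # expected spikes per time bin
spikes = rng.poisson(rate)

# Spike-triggered average: for Gaussian white noise, the STA is
# proportional to the true linear filter (Bussgang's theorem)
sta = np.zeros(k_len)
for t in range(k_len - 1, T):
    sta += spikes[t] * stim[t - k_len + 1 : t + 1][::-1]
sta /= spikes.sum()

print(np.corrcoef(sta, true_k)[0, 1])  # close to 1 if recovery worked
```

The STA only recovers the filter up to a scale factor; the nonlinearity would then be estimated separately, e.g. by histogramming the filtered stimulus against the observed spike counts.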
The parts included:
- Gaussian Estimation
- Poisson Estimation
- Victor & Purpura Distance Metric
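The Victor & Purpura metric, for instance, measures the dissimilarity between two spike trains as the minimum cost of transforming one into the other, where deleting or inserting a spike costs 1 and shifting a spike by Δt costs q·Δt. A small dynamic-programming sketch (spike times in seconds; not our actual project code):

```python
import numpy as np

def victor_purpura(s1, s2, q):
    """Victor-Purpura distance between spike trains s1, s2 (shift cost q per second)."""
    n, m = len(s1), len(s2)
    # D[i, j] = distance between the first i spikes of s1 and first j of s2
    D = np.zeros((n + 1, m + 1))
    D[:, 0] = np.arange(n + 1)   # deleting unmatched spikes costs 1 each
    D[0, :] = np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = min(
                D[i - 1, j] + 1,                                 # delete from s1
                D[i, j - 1] + 1,                                 # delete from s2
                D[i - 1, j - 1] + q * abs(s1[i - 1] - s2[j - 1]) # shift to match
            )
    return D[n, m]
```

Intuitively, q sets the timescale: for q = 0 the metric only counts spikes, while for large q shifting becomes more expensive than delete-plus-insert (capped at cost 2), so it behaves like a coincidence detector.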
It was a war of nerves, with computation times growing as the parts kept piling up. Add to that the travelling involved over the past few days, and you have the recipe for ultimate frustration. But somehow, we pulled it off. I worked mostly in collaboration with Shailesh, and we got it all figured out (or so we presume).
Here’s the GitHub repository for the entire course, including all three projects. It also contains the problem statements and other helpful resources.
So after two night-outs (not in succession), hours and hours of coding, and hours of waiting 😛 , here’s our final submission, attached as a Google Doc.
Hope all this effort helps me out, and maybe others too, someday!