We recently published two new ML/neuroscience research projects as part of the Requests for Research (RFRs), in collaboration with WBAI. They're fascinating topics that arose through our relationship with our advisor Elkhonon Goldberg of the Luria Neuroscience Institute.
It's a joy to be able to test an idea directly, without wrestling with the tools. We recently developed an experimental setup that, so far, looks like it will do just that. I'm excited about it and hope it can help you too, so here it is. We'll go through why we created another framework, and how each module in the experimental setup works.
This is the second part of our comparison between convolutional competitive learning and convolutional or fully-connected sparse autoencoders. To understand our motivation for this comparison, … Read more » Convolutional Competitive Learning vs. Sparse Autoencoders (2/2)
Competitive learning is a branch of unsupervised learning that was popular a long, long time ago, in the 1990s. Older readers may remember the days… Read more » Convolutional Competitive Learning vs. Sparse Autoencoders (1/2)
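To give a flavour of what competitive learning means in practice, here is a minimal winner-take-all sketch in plain Python. It is a generic illustration, not the convolutional implementation discussed in the posts; the prototype count, learning rate, and data are all illustrative assumptions.

```python
# A minimal winner-take-all competitive-learning sketch (pure Python).
# Prototypes, learning rate, and inputs here are illustrative only.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

# Two unit-length prototype vectors competing over 2-D inputs.
prototypes = [normalize([1.0, 0.2]), normalize([0.2, 1.0])]

def train_step(x, lr=0.1):
    # The unit whose prototype best matches the input (highest dot product) wins...
    winner = max(range(len(prototypes)),
                 key=lambda i: sum(p * xi for p, xi in zip(prototypes[i], x)))
    # ...and only the winner moves toward the input -- the competitive update.
    prototypes[winner] = normalize(
        [p + lr * (xi - p) for p, xi in zip(prototypes[winner], x)])
    return winner

# Inputs drawn from two directions; each prototype specializes to one.
data = [[1.0, 0.1], [0.1, 1.0]] * 50
for x in data:
    train_step(normalize(x))
```

After training, each prototype has drifted toward one input cluster, so the two units partition the input space between them without any labels.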
Eager Execution is an imperative, object-oriented, and more Pythonic way of using TensorFlow. It provides a flexible machine learning platform for research and experimentation in which operations are evaluated immediately and return concrete values, instead of constructing a computational graph that is executed later.
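The graph-versus-eager distinction can be sketched in plain Python, without TensorFlow itself. The `Node`/`Const` classes below are stand-ins for deferred graph construction, not TensorFlow's actual API; the point is only the contrast between building an expression and evaluating it.

```python
# A plain-Python sketch contrasting deferred "graph" execution with
# eager execution. These classes are illustrative stand-ins, not
# TensorFlow's real API.

class Node:
    """Graph mode: building an expression returns a node, not a value."""
    def __init__(self, fn, *parents):
        self.fn, self.parents = fn, parents
    def run(self):
        # Values are only produced when the graph is explicitly executed.
        return self.fn(*(p.run() for p in self.parents))

class Const(Node):
    def __init__(self, value):
        super().__init__(lambda: value)

def graph_add(a, b):
    return Node(lambda x, y: x + y, a, b)

deferred = graph_add(Const(2), Const(3))
print(type(deferred).__name__)   # a Node, not the number 5
print(deferred.run())            # 5 -- only produced when run

# Eager mode: operations evaluate immediately and return concrete values.
def eager_add(a, b):
    return a + b

print(eager_add(2, 3))           # 5, straight away
```

In graph mode you inspect an opaque node until you explicitly run it; in eager mode every operation hands back a concrete value you can print, branch on, or debug immediately.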
We've just uploaded a spin-off research paper to arXiv titled "Sparse Unsupervised Capsules Generalize Better". So what's it all about? Capsule Networks: you may have… Read more » Sparse Unsupervised Capsules Generalize Better
Datasets are an integral part of an ML engineer's toolkit. We recently compiled useful information about a range of well-known datasets. It's all in one place, and hopefully useful to others as well.
ML Today: Today's machine learning has demonstrated unprecedented performance in seemingly every application thrown at it. Almost all of that success has been based… Read more » The case for Episodic Memory in Machine Learning