Overfitting is commonplace in machine learning. But by confining ourselves to graphical data exploration, we create a risk of another form of overfitting, namely archetypal overfitting. In this post, I outline why this is a problem and how better data analysis can reduce it.
Using Rust for machine learning still has a ways to go. It's possible to overcome some of the limitations, however, by getting familiar with the lower-level tools that drive high-performance linear algebra computing: SIMD, BLAS, and LAPACK.
We need a new maps app. Our maps define our cities, and well, Google and Apple just don't make the city shine. They get you from point A to point B fine, but there is so much more to urban life than that.
Generative adversarial networks (GANs) are all the rage. I explore whether we can generate cookie recipes with GAN architectures, doing so in a way that's neither NLP nor computer vision.
Pie is one of the great joys of life. To all of those cake fans in the great pie vs. cake debate, I respect you, but you are wrong. Nothing else in the world can convey the same set of emotions as a delicious homemade pie. This is my years-developed recipe for blueberry pie with the best blueberries.
Machine learning, the field on which the vast majority of artificial intelligence systems depend, has tremendous potential to do good if harnessed correctly. When used properly, algorithms can allow for better-timed phone calls, conversations directly related to a voter's interests, and, hopefully, fewer robocalls in the middle of dinner.
The United States needs a new digital infrastructure. Estonia, with its recent approach to identification and virtual residency, provides a model for America to follow. Perhaps among the greatest selling points of this program is that the federal government need not be its administrator; rather, it could be tested and tweaked state by state.