orbOfOrthanc's comments | Hacker News

Going out on a limb here to guess you live in SF :\


I live in LA and I'm mad about this exact same thing! Seattle (where I lived for a while recently) isn't much better at all.


Long Island ;)


Hosting on GH; #weird-flex-but-ok

Still, I like the style


What's strange about hosting on GitHub?


Hmmmm, WordPress...


With WordPress you can buy a cool design in a supermarket.


And then deal with their ecosystem of plugins that all try to sell you stuff.

Busy figuring out my own site, and very certain that there will be no WordPress involved.


Indeed.


Damn. You’re quite right. Excellent typography and layout even on mobile. Thanks for sharing.


Looks a little broken on my iPhone XS (iOS 13.2.3).


Same on iPad Pro with Safari; some elements cut off at the top after the animation runs.


This literally looks like LaTeX


You say that like it's a bad thing!

But the TeX math is interesting to discuss: it turns out that you can skip the usual multi-second download/parse/render/reflow workflow of MathJax JS libraries on a static website by preprocessing the final HTML pages with https://github.com/mathjax/MathJax-node. This gets you pretty much the best of all worlds: it renders instantly without JS, looks good, works cross-browser, and is dead simple to set up, since you just pipe pages through a tool. Definitely the best way I've found for static sites to render math.
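To make that concrete, here's a rough sketch of such a post-build step in Python. The `mjpage` CLI comes from the companion mathjax-node-page package rather than mathjax-node itself, the `public/` output directory is hypothetical, and the flag names are from memory, so verify against `mjpage --help` before relying on them:

    # Sketch of a post-build step that pre-renders TeX math in static HTML.
    # Assumes `npm install -g mathjax-node-page` has provided the `mjpage` CLI;
    # flag names are from memory, so check `mjpage --help`.
    import pathlib
    import subprocess

    SITE_DIR = pathlib.Path("public")  # hypothetical static-site output dir

    for page in SITE_DIR.rglob("*.html"):
        html = page.read_text(encoding="utf-8")
        # Pipe the page through mjpage: TeX delimiters in, plain HTML+CSS out,
        # so the browser never has to run MathJax's JS at all.
        result = subprocess.run(
            ["mjpage", "--dollars", "--output", "CommonHTML"],
            input=html,
            capture_output=True,
            text=True,
            check=True,
        )
        page.write_text(result.stdout, encoding="utf-8")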


Despite the flippant style of my comment, it actually wasn’t intended as an insult. I quite like your approach.


Seems to be down :/


BLUF: You can evolve mixed activation functions and layer dimensions of a neural network to substantially improve accuracy over a fixed, single-activation architecture.

Many improvements: added plotting capabilities, so you can now draw the evolution of a network over time. Also, I like .gifs, so I made one of the net evolving.


TensorFlow-backed Evolutionary Neural Networks.

Simulate a population of neural networks as individuals and evolve their structure to find an optimal mix of layer sizes and activation functions.

Written in pure Python and TensorFlow. Also buggy as hell, considering I wrote it in an afternoon.

Enjoy
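For anyone curious how the evolutionary loop works in principle, here's a minimal self-contained sketch in the same spirit. This is not the project's actual API: the genome encoding, the mutation scheme, the hyperparameters, and the names (`random_genome`, `build_net`, `mutate`, `evolve`) are all illustrative.

    # Minimal sketch (not the project's actual API): evolve per-layer
    # activation functions and layer widths for a small Keras MLP.
    import random
    import tensorflow as tf

    ACTIVATIONS = ["relu", "tanh", "sigmoid", "elu"]

    def random_genome(n_layers=3):
        # A genome is a list of (width, activation) pairs, one per hidden layer.
        return [(random.choice([16, 32, 64]), random.choice(ACTIVATIONS))
                for _ in range(n_layers)]

    def build_net(genome, input_dim, n_classes):
        model = tf.keras.Sequential([tf.keras.Input(shape=(input_dim,))])
        for width, act in genome:
            model.add(tf.keras.layers.Dense(width, activation=act))
        model.add(tf.keras.layers.Dense(n_classes, activation="softmax"))
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def mutate(genome):
        # Point-mutate one layer's width or activation.
        g = list(genome)
        i = random.randrange(len(g))
        width, act = g[i]
        if random.random() < 0.5:
            width = random.choice([16, 32, 64])
        else:
            act = random.choice(ACTIVATIONS)
        g[i] = (width, act)
        return g

    def evolve(x_train, y_train, x_val, y_val, pop_size=8, generations=5):
        # Expects numpy arrays with integer class labels in y.
        population = [random_genome() for _ in range(pop_size)]
        n_classes = int(y_train.max()) + 1
        for gen in range(generations):
            scored = []
            for genome in population:
                net = build_net(genome, x_train.shape[1], n_classes)
                net.fit(x_train, y_train, epochs=3, verbose=0)
                _, acc = net.evaluate(x_val, y_val, verbose=0)
                scored.append((acc, genome))
            scored.sort(key=lambda s: s[0], reverse=True)
            # Truncation selection: keep the top half, refill by mutation.
            survivors = [g for _, g in scored[: pop_size // 2]]
            population = survivors + [mutate(random.choice(survivors))
                                      for _ in range(pop_size - len(survivors))]
            print(f"gen {gen}: best val acc {scored[0][0]:.3f}")
        return scored[0][1]

Selection here is plain truncation plus point mutation; the real project presumably does something richer, but this captures the core loop: build, train briefly, score, keep the best, mutate.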


This seems really cool. It can sometimes be pretty unintuitive which activation function(s) to choose.


Exactly. This lets you test it for yourself, and it can actually pick up on some highly nonlinear behavior. I applied it myself to housing-price prediction and got better results than with a single function per layer.


Makes sense. Out of all the permutations of functions, I see no reason why using the same function at every node would lead to an optimized network.

I really commend the ML people out there who have developed an intuition for network architecture and which functions to use... If there's a Feynman for machine learning, I wanna listen to some of his lectures.

