Hacker News | sl8r's comments

I left the TENET references on the cutting room floor.

I too found it really surprising that the reverse-time equation has a simple closed form. Like, surely breaking a glass is easier than unbreaking it? That’s part of what got me interested in this stuff in the first place!
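
If it helps to see it mechanically, here's a minimal sketch of a sampler built on that closed form (Anderson's reverse-time SDE with f = 0, i.e. a variance-exploding forward process). `score(x, t)` and `g(t)` are hypothetical stand-ins for the learned score network and the diffusion coefficient:

  import numpy as np

  def reverse_sde_sample(score, g, x_T, T=1.0, n_steps=1000):
      # Euler-Maruyama discretization of the reverse-time SDE (f = 0):
      #   dx = -g(t)^2 * score(x, t) dt + g(t) dw_bar,
      # run from t = T back to t = 0, starting from pure noise x_T.
      dt = T / n_steps
      x = x_T.copy()
      for i in range(n_steps):
          t = T - i * dt
          z = np.random.randn(*x.shape)
          x = x + g(t) ** 2 * score(x, t) * dt + g(t) * np.sqrt(dt) * z
      return x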

If you haven’t seen it yet, highly recommend the blogs of Sander Dieleman & Yang Song (who co-invented the SDE interpretation).


The production of tears is left as an exercise for the reader. /s

Thanks for reading. The 2D simulation section might be more interesting on a first read — it makes the math less mysterious, I hope!


Hey, I’m the post author — thanks for reading.

Theta represents all the model params, i.e. all the weights in the neural network. The convention is to subscript the learned score function with theta (s_theta) and to drop the subscript when writing the true score function.
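
If a code sketch helps (hedged, this isn't the post's actual code): `score_theta` below is the learned network, theta being whatever parameters live inside it, and it's trained to match the true score of the Gaussian perturbation kernel via denoising score matching. It assumes `x0` is a (batch, dim) array and `sigma(t)` gives the noise level:

  import numpy as np

  def dsm_loss(score_theta, x0, sigma, rng=np.random):
      # Denoising score matching: perturb x0 with Gaussian noise, then regress
      # the learned score onto the true score of the perturbation kernel,
      # which is -(x_t - x0) / sigma(t)^2 = -eps / sigma(t).
      t = rng.uniform(size=x0.shape[0])
      eps = rng.randn(*x0.shape)
      xt = x0 + sigma(t)[:, None] * eps
      target = -eps / sigma(t)[:, None]
      per_sample = np.sum((score_theta(xt, t) - target) ** 2, axis=-1)
      return np.mean(sigma(t) ** 2 * per_sample)   # sigma^2 weighting balances noise levels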


Hey! I’m the post author. Highly recommend Sander Dieleman’s blog for alternative interpretations https://sander.ai/2023/07/20/perspectives.html

I personally find the SDE view the most intuitive, and the deterministic ODE / consistency models / rectified flow ideas easier to understand once you know the SDEs. But not everyone agrees!
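
For the deterministic side, the probability flow ODE is basically the usual reverse-SDE sampler with the noise dropped and the score term halved. A rough sketch, again with hypothetical `score(x, t)` and `g(t)`:

  def probability_flow_ode_sample(score, g, x_T, T=1.0, n_steps=1000):
      # Deterministic counterpart with the same marginals (f = 0):
      #   dx/dt = -0.5 * g(t)^2 * score(x, t), integrated from t = T down to 0.
      dt = T / n_steps
      x = x_T.copy()
      for i in range(n_steps):
          t = T - i * dt
          x = x + 0.5 * g(t) ** 2 * score(x, t) * dt
      return x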


Thanks for sharing this! I tend to agree that it’s easiest to understand this way.

I just find it a frustrating fact about modern machine learning that the nice SDE interpretation is somehow not the “truth”; it’s just a tool for understanding.


> This has been the promise over and over again, for centuries, and it has consistently not paid off. Where's the predicted society where automation allows us all to work for two hours a day, and spend the rest at leisure?

In the “Sad Irons” chapter of Caro’s LBJ biography, he talks about the pre-electrification lives of Texas farmers. In comparison with that, our whole day is leisure.

Similarly: As late as 1900, the poor in Europe were so severely malnourished that growth stunting was common. Look at Our World in Data’s charts of height over time. Or Robert Fogel’s “The Escape from Hunger and Premature Death.”

Etc. Etc.



What major institution failed in the early stages of [1] the Japan asset bubble or [2] the dot-com bubble? Or farther back, the 1840s railroad mania or the South Sea bubble?

Credit bubbles often pop when some institution can't cover its obligations, but asset bubbles don't seem to need such a failure; they can deflate on their own.


No one ever could have predicted those things! Except for the large number of documented cases of people and institutions who did...


I'm not saying that crises aren't predictable — I'm pointing out that institutional failure isn't always a leading indicator (as in OP's argument).


Sorry if it wasn't clear: I completely agree with you. I just feel like people make that argument in response to what you're saying, and it's not a very good one.


Interestingly, circular references are actually used all the time in LBO models, because the default is to sweep all FCF to pay down debt, but then the interest expense depends on FCF, which depends on the interest expense… It's sort of a trivial example because you could solve it by being more granular with time periods, but in practice people just use the circular ref.
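
As a toy illustration (made-up numbers, not a real LBO template), this is effectively what Excel's iterative-calculation setting does with that circular ref:

  def solve_cash_sweep(ebitda, opening_debt, rate, n_iter=50):
      # Circular: interest depends on the average debt balance, the balance
      # depends on the FCF sweep, and FCF depends on interest. Fixed-point
      # iteration settles in a handful of passes.
      interest = 0.0
      for _ in range(n_iter):
          fcf = ebitda - interest               # toy FCF definition
          paydown = min(fcf, opening_debt)      # sweep all FCF to debt
          avg_debt = opening_debt - paydown / 2
          interest = rate * avg_debt
      return interest, paydown

  # e.g. solve_cash_sweep(ebitda=100.0, opening_debt=500.0, rate=0.08)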


I've also seen it bastardized into running a loop for a Monte Carlo sim instead of using a VBA macro. Genius and stupid all at the same time.


I made a Streamlit app about Kelly last year, showing how to bet when you have an "edge" over a toy market of coin flippers: https://kelly-streamlit.herokuapp.com/

Other references I found interesting:

  - Cover and Thomas's "Elements of Information Theory" shows some interesting connections between Kelly betting and optimal message encoding.
  - Ed Thorp, the inventor of card counting, has a nice compendium of papers on this in "The Kelly Capital Growth Investment Criterion".


If you're interested in this, check out Cover's Information Theory textbook; the rabbit hole goes much deeper. One of the most interesting examples is that when you're betting on a random event, Shannon entropy tells you how much to bet & how quickly you can compound your wealth. Cover covers (heh) this, and the original paper is Kelly: http://www.herrold.com/brokerage/kelly.pdf
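
A minimal sketch of that connection for the simplest case, an even-money bet you win with probability p > 0.5 (this is the standard Kelly result, nothing specific to Cover's presentation):

  import numpy as np

  def kelly_even_money(p):
      # Optimal fraction of wealth to bet, and the resulting growth rate,
      # which works out to exactly 1 - H(p): the entropy shortfall of the coin.
      f_star = 2 * p - 1
      entropy = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
      growth = 1 - entropy          # expected log2-wealth gained per bet
      return f_star, growth

  # kelly_even_money(0.55) -> bet 10% of wealth, compound ~0.007 bits (~0.5%) per bet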


Kelly's paper (based on this paper by Shannon) is responsible for fundamentally reshaping equity, commodity, and even sports betting markets.

I highly recommend William Poundstone's book "Fortune's Formula" as a biography of those ideas - it's almost as good as any Michael Lewis book on the subject would be.

