
Any meaningful seq2seq data can, in principle, be modelled by a transformer.

Language models can be trained to generate novel, functional protein sequences (by training on protein function tags and their corresponding amino-acid sequences), bypassing any explicit folding step entirely.

https://www.nature.com/articles/s41587-022-01618-2
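
As a rough sketch of the conditioning idea (not the paper's actual model or data), you can treat a function keyword as a control tag prepended to each amino-acid sequence, train an ordinary causal LM on the concatenation, and then generate by sampling from the tag alone. Everything below (the tag names, the toy transformer, the single training example) is invented for illustration:

    # Toy illustration of tag-conditioned sequence generation. Tags, model size,
    # and the example sequence are made up; the real work trains a large
    # transformer on millions of annotated protein sequences.
    import torch
    import torch.nn as nn

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    TAGS = ["<lysozyme>", "<chorismate_mutase>"]   # hypothetical control tags
    vocab = TAGS + list(AMINO_ACIDS) + ["<eos>"]
    stoi = {t: i for i, t in enumerate(vocab)}

    def encode(tag, seq):
        # One training example: function tag, then the residues, then <eos>.
        return torch.tensor([stoi[tag]] + [stoi[a] for a in seq] + [stoi["<eos>"]])

    class TinyCausalLM(nn.Module):
        def __init__(self, vocab_size, d_model=64, nhead=4, nlayers=2):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, d_model)
            layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, nlayers)
            self.head = nn.Linear(d_model, vocab_size)

        def forward(self, x):
            # Causal mask so each position only attends to earlier tokens.
            L = x.size(1)
            causal = torch.triu(torch.full((L, L), float("-inf")), diagonal=1)
            return self.head(self.encoder(self.emb(x), mask=causal))

    model = TinyCausalLM(len(vocab))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    # One next-token-prediction step on a single invented example.
    ex = encode("<lysozyme>", "MKALIVLGLVLLSVTVQG").unsqueeze(0)
    logits = model(ex[:, :-1])
    loss = nn.functional.cross_entropy(logits.reshape(-1, len(vocab)),
                                       ex[:, 1:].reshape(-1))
    loss.backward()
    opt.step()

    # Generation: start from the function tag alone, sample residues until <eos>.
    ctx = torch.tensor([[stoi["<lysozyme>"]]])
    for _ in range(300):
        with torch.no_grad():
            probs = torch.softmax(model(ctx)[0, -1], dim=-1)
        nxt = torch.multinomial(probs, 1)
        if nxt.item() == stoi["<eos>"]:
            break
        ctx = torch.cat([ctx, nxt.view(1, 1)], dim=1)
    print("".join(vocab[i] for i in ctx[0, 1:].tolist()))

The point of the sketch is just the data layout: the "function" is nothing more than a token the model learns to condition on, so no structural or folding information is ever represented.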

May as well try.


