That's a good idea to tackle next. The project is only a few days old so I haven't added this yet. Will look into it! It does have memory context in the sense that as you chat, it will remember your messages, but there's no support for pre-populating the context from a document (though you could drop a bunch of content into the warmup prompt right now).
We're thrilled to announce the launch of AST-1, a language model built around an RNN-based attention mechanism that produces coherent, contextually relevant text while keeping compute and memory requirements manageable.
The key to AST-1's success is its novel RNN attention mechanism. Unlike traditional methods, RNN attention lets the model dynamically focus on different parts of the input sequence, resulting in more natural-sounding text and the ability to handle long input sequences.
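The announcement doesn't specify how AST-1's attention is computed, but the general idea of attention over RNN hidden states can be sketched as follows. This is an illustrative example only, not AST-1's actual implementation: the model scores each hidden state against a query vector, turns the scores into a probability distribution with a softmax, and returns a weighted sum (the context vector). All names (`additive_attention`, the weight matrices `W_h`, `W_q`, `v`) and dimensions here are assumptions for the sake of the sketch.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(hidden_states, query, W_h, W_q, v):
    """Score each RNN hidden state against a query and return the
    attention-weighted sum (context vector) plus the weights."""
    # hidden_states: (seq_len, d), query: (d,)
    scores = np.tanh(hidden_states @ W_h + query @ W_q) @ v  # (seq_len,)
    weights = softmax(scores)        # distribution over input positions
    context = weights @ hidden_states  # (d,) weighted sum of states
    return context, weights

# Toy example: six hidden states of dimension 4.
rng = np.random.default_rng(0)
d = 4
h = rng.standard_normal((6, d))    # stand-in for RNN hidden states
q = rng.standard_normal(d)         # stand-in for the current query
W_h = rng.standard_normal((d, d))
W_q = rng.standard_normal((d, d))
v = rng.standard_normal(d)

context, weights = additive_attention(h, q, W_h, W_q, v)
```

Because the weights form a probability distribution over positions, the model can place more mass on the parts of the input most relevant to the current step, which is what "dynamically focus on different parts of the input sequence" refers to.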
AST-1's architecture is designed for performance, with model parameters tuned for both efficiency and accuracy. Its training process equips it for a range of natural language tasks, from sentiment analysis to machine translation.
We believe that AST-1 represents a significant milestone in language model development and invite you to try it out for yourself. Discover the power of AST-1 today and let us know your thoughts!