It's mostly understanding text and generating text. You can do named entity extraction, question answering, summarisation, dialogue bots, information extraction from semi-structured documents such as tables and invoices, spelling correction, typing auto-suggestions, document classification and clustering, topic discovery, part-of-speech tagging, syntactic parsing, language modelling, image description and image question answering, entailment detection (whether one statement follows from another), coreference resolution, entity linking, intent detection and slot filling, building large knowledge bases (databases of subject-relation-object triplets), spam detection, toxic message detection, ranking search results in search engines, and many, many more.
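To make one of those tasks concrete, here's a toy, purely illustrative sketch of rule-based named entity extraction (not how production NER works; real systems use trained models). The regex heuristic and the example sentence are my own assumptions:

```python
import re

def extract_entities(text):
    # Toy heuristic: treat runs of capitalized words as candidate named
    # entities. This over-triggers on sentence-initial words and misses
    # lowercase entities -- it's only meant to illustrate the task.
    return re.findall(r"[A-Z][a-z]+(?:\s[A-Z][a-z]+)*", text)

print(extract_entities("Alice met Bob Smith in Paris last Tuesday."))
# ['Alice', 'Bob Smith', 'Paris', 'Tuesday']
```

A trained model would additionally label each span (person, location, date, etc.) instead of just finding it.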
All of the above; it's like asking what problems you can solve with math. HuggingFace's transformers library is said to be a Swiss Army knife for NLP. I haven't worked with it yet, but the main fundamental utility seems to be generating fixed-length vector representations of words. Word2vec started this, but the vectors have gotten much better with models like BERT.
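The "fixed-length vector" idea can be sketched without any library: represent each word by how often it co-occurs with every other word, then compare words by cosine similarity. This is a minimal precursor to word2vec/BERT-style embeddings, with a made-up toy corpus of mine:

```python
import math

def cooccurrence_vectors(sentences, window=2):
    # Each word becomes a fixed-length vector of co-occurrence counts
    # with every vocabulary word within a small context window.
    vocab = sorted({w for s in sentences for w in s})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = {w: [0] * len(vocab) for w in vocab}
    for s in sentences:
        for i, w in enumerate(s):
            for j in range(max(0, i - window), min(len(s), i + window + 1)):
                if j != i:
                    vectors[w][index[s[j]]] += 1
    return vectors

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

sents = [["the", "cat", "sat"], ["the", "dog", "sat"], ["a", "cat", "ran"]]
vecs = cooccurrence_vectors(sents)
# "cat" and "dog" end up similar because they share contexts ("the", "sat").
print(round(cosine(vecs["cat"], vecs["dog"]), 3))  # 0.707
```

Word2vec and BERT learn dense, much lower-dimensional versions of this idea, and BERT's vectors additionally change with the surrounding sentence.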
There's a lot! Sentence detection and part-of-speech (POS) tagging, to name a couple. These can be used to determine key concepts in documents that lack metadata. For example, you could cluster on common phrases to identify relationships in the data.
What problems can you solve with NLP? Sentiment analysis? Semantic analysis? Translation?
What cool problems are there?