
I want to hear their real words. People don't need to be perfect writers. I just want to know what they really think.




If someone can express themselves well enough that their ideas are still clear, they are a competent writer.

Bad writing starts in the "wtf was that meant to say" territory, which can cause unnecessary conflicts or prolong an otherwise routine communication.

I don't like people using AI to communicate with other people either, but I understand where they're coming from.


And the LLM can parse total garbage and understand the intent of the writer? I know that when I'm vague with an LLM, I get junk or inappropriate output.

As an optimist, I would say that it could be better at teasing out your intent in an interactive way, then producing something along those lines. People aren't ashamed to answer questions from an AI.

The issue is that people say this, but still negatively judge people for making grammar/spelling mistakes. So, the practice will continue.

That might shift in the future. I've actually found myself leaving small errors in sometimes, since they suggest that I actually wrote it. I don't use literal em-dashes -- but I often use the manual two-hyphen version, and have been doing so for much longer than mainstream LLMs have been around. I also use a lot of bulleted lists -- both of which now imply LLM usage. I take my writing seriously, even when it's just an internet comment. The idea that people might think I wrote with an LLM would be insulting.

More to the point, spelling and grammar errors might become a boutique sign of authenticity, much like fake "hand-made" goods with intentional errors or aging added in the factory.


We've had spell check for decades, and automatic grammar checking for at least a decade in most word processors.

None of this needs generative AI to pad out a half-baked idea.


Unless you are using a proprietary, dedicated grammar checker, auto grammar check is far from perfect and will miss some subject-verb agreement errors, incorrect use of idioms, or choppy flow. Particularly in professional environments where you are being evaluated, this can tank an otherwise solid piece of written work. Even online in HN comments, people will poke fun at grammar, and (while I don't have objective evidence for this) I have noticed that posts with poor grammar or misspellings tend to have less engagement overall. In a perfect world, this wouldn't matter, but it's a huge driving factor for why people use LLMs to touch up their writing.

I like your optimism.


