If possible, could I get your opinion on a specific example? In my current situation, I was asked to add a feature which required a few (Java) classes. So -
* It seems like this would have been a milestone?
* So then maybe a few issues for the different classes or requirements?
* For each issue, during or after development I would note what tests are needed, maybe in the comments section of the issue? Maybe in the description?
* And then automated tests using JUnit?
I don't know your deployment schedule or rules. I represent milestones as groups of independent issues (bug fixes or new features) that would all go together as a release. I don't use milestones as a group of multiple issues that represent one requirement (that would be referred to as an epic). Epics are part of the paid version, but there's no reason why you couldn't use milestones this way.
If you have a requirement (doesn't matter how big or small), I'd treat that as one issue, regardless of how many Java classes or lines of code need modifying. If the issue is complex, then within the issue's description you can use markdown (like checkboxes or bullet points) to identify sub-requirements. However, if you can break that large requirement into functional changes that could exist or be deployed separately, then I'd probably do multiple independent issues with some type of common identifier in the issue names (or use your interpretation of milestones and put all those issues into one).
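For instance, a complex issue's description might carry its sub-requirements as a markdown task list (GitLab and GitHub both render `- [ ]` as a checkbox). The feature and items here are made up for illustration:

```
Add CSV export to the report screen

- [ ] Export button on the report toolbar
- [ ] Handle reports with zero rows
- [ ] Escape commas and quotes in field values
```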
If you use GitLab as your git repository, then tying an issue to a merge request is easy, and it will then show you the diff (i.e. all the changes to source code) that the issue required for implementation.
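As a sketch of how that tie works: GitLab recognizes closing patterns in a merge request's description, so a line like the one below links the MR to the issue and closes the issue on merge (the issue number #42 is made up):

```
Implements the CSV export requested in #42.

Closes #42
```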
In terms of tests, same kind of answer: I don't know your rules. Every issue should have a test plan, and markdown in the issue's description is probably the easiest way to convey it. If you automate the test using JUnit, then the test plan may be nothing more than "make sure test xyz from JUnit passes"; if it's a manual test, then the issue's description can have a list of steps using markdown.
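A minimal sketch of what that automated side might look like, assuming JUnit 5; the ReportExporter class and its behavior are invented purely for illustration:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.List;
import org.junit.jupiter.api.Test;

// Hypothetical class under test, standing in for whatever the issue's feature touches.
class ReportExporter {
    String toCsv(List<String> names) {
        StringBuilder sb = new StringBuilder("name\n");
        for (String name : names) {
            sb.append(name).append('\n');
        }
        return sb.toString();
    }
}

class ReportExporterTest {

    @Test
    void emptyInputYieldsHeaderOnly() {
        // The issue's test plan can then just say: this test must pass.
        assertEquals("name\n", new ReportExporter().toCsv(List.of()));
    }

    @Test
    void eachNameBecomesOneRow() {
        assertEquals("name\nalice\nbob\n",
                new ReportExporter().toCsv(List.of("alice", "bob")));
    }
}
```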
Each issue can be anything from a one-line fix to a one- or two-week effort. But by that point it's probably better as a meta-issue that links to other issues, or an issue with inline subtasks (GitHub has checkboxes).
Features cut across code, so there's no 1:1 mapping with classes. Tests are generally self-documenting and land alongside the feature they are for. You can document them, but likely either as a comment in the issue/PR if technically interesting, or in a separate wiki-style doc as part of a broader specification.
Ideally each commit is valid and passes tests (see "conventional commits"), and each issue/PR has accompanying tests, whether for a new feature or a bugfix. Particular test frameworks change every year.
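"Conventional commits" is just a commit-message convention: a type, an optional scope, and a short description. These example messages are invented, but the format is the standard one:

```
feat(report): add CSV export
fix(report): handle empty result set during export
test(report): cover CSV export of an empty report
```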
That would be nice, and maybe I should have clarified why I asked the question. I was asked to add a large new feature, and some bugs popped up along the way. I thought better testing could have helped, and then I thought it would possibly help to list the requirements as well, so I could determine which tests to write/perform. And really, I thought I could have been writing those myself: the PO tells me generally what's needed, and I try to determine what's important from there.
Or maybe I just need to do better testing myself? There are no code reviews around here, nor much of an emphasis on writing issues, or any emphasis on testing that I've noticed. So it's kind of tough figuring out what I can do.
Well, the reason I asked this question is that I did screw up a bit, and I think it could have been caught had I done sufficient testing. But I didn't, because it doesn't seem to be part of the culture here, and neither are peer reviews.
So I _was_ trying to do only what was asked of me, just writing the code, but I guess I thought what I did at my previous job could have helped: keeping track of what was needed, and then how I planned to accomplish and test it.
But yeah, you've got me thinking about how or whether I should broach this topic; I think my lead is great, seems open to ideas, wants things to work well, so maybe I'll just ask what they think about how to avoid these kinds of mistakes.
As a junior dev you shouldn't be able to screw up big time; if you do, that's on the team/company, not on you. As a senior it is trickier, but usually no one should be able to screw up monumentally; if they do, it's a lack of internal process, not on the individual (the exception being malicious intent).
Changing internal processes without being a decision maker inside the company (e.g. an influential manager/lead, the owner, a VP, etc.) is hard, even if there are clear benefits. If there are things that make no sense, no improvements on the horizon, and you are not learning from your seniors, consider whether it makes sense to move on. Trying to change internal processes at reluctant employers is a common cause of immense frustration (and burnout); don't let yourself get caught in that.
> so maybe I'll just ask what they think about how to avoid these kinds of mistakes
This, 100%.
Don't tell anyone at work you asked on HackerNews and got feedback - they don't want to debate the merits of various approaches. They want it done their way, because it is obviously the right way, or else they would've modified it, right? :)
Most jobs are repetitive, so you eliminate mistakes just by doing it for a while. Hence nothing extra needs to be done, which is exactly how most people like it and why your company has no peer review or much of anything - because it just works, with the least amount of effort, somehow, someway :)
"give it a quick test and ship it out, our customers are better at finding bugs than we are" - lecture from the CEO of a company I used to work for who didn't want me to waste any time testing and didn't want to pay me to do testing. I left soon after that to find a place with a different culture, trying to change it was way too hard