The more you struggle at something the more you will learn. That works up to the point where the struggle is beyond your capacity for struggle - then you just get stuck. So ideally (assuming you are doing this because you want to learn something) you want to reduce the amount of struggle to just below your capacity.
Just copying someone else's solution, or getting an LLM to fix it for you will be very low struggle, so you won't learn much.
To add some struggle, maybe look up a solution in a different language and translate it to your language? You could choose a solution in a language similar to yours - if you are solving in C, perhaps look up a C# solution. Or, to make it harder, look up a solution in a different paradigm: find a Haskell or Prolog solution and see if that gives you enough hints.
I can see the benefit of the struggle, but in relation to Day 1 Part 2 I literally had no clue why what I'd written wasn't working or what I needed to do to fix it.
I'm using nushell this year for AoC. It's functional by nature though you can make it imperative, so it's out of my comfort zone.
However, the problem is math-based whatever language you're using, involving modulo and integer division. I had a hunch it was about that, but no sense of what to do or how to approach it.
Having looked at multiple ways of doing it I still have no clue what's going on, only that it works.
Ask for help explaining it. Check if a similar question has been asked already, but if not post your code to the subreddit and ask for help understanding why it works. The subreddit is friendly, people will answer if they see the question and can understand the code.
I've now done probably close to 100 system design interviews. One of the main things I've looked for in candidates is their ability to identify, communicate, and discuss trade-offs. The next thing on my checklist is their ability to move forward, pick an option, and defend that option. Really nimble candidates will pivot, recognizing when to change approaches because requirements have changed.
The goal here is to see if the candidate understands the domain (generic distributed systems) well enough on their own. For more senior roles I look to make sure they can then communicate that understanding to a team, and then drive consensus around some approach.
> For more senior roles I look to make sure they can then communicate that understanding to a team, and then drive consensus around some approach.
This is why I’m stuck at Senior lol. I can craft incredibly deep technical documents on why X is the preferred path, but when inevitably someone else counters with soft points like DX, I fall down. No, I don’t care that the optimal solution requires you to read documentation and understand it, instead of using whatever you’re used to. If you wanted to use that, why did you ask me to do a deep-dive into the problem?
This only seems to happen in software engineering. When I was told we wanted to evaluate a new task queue service, I asked what our constraints were. I was told to survey all options, present a roundup, and ignore constraints. Contrast with something like, I don't know, building a house. Do architects consider all possible material choices for a given location, or do they instead limit their consideration to materials that would be suitable for the given environment?
Making the dependencies of "it depends" explicit is the whole point.
Having built single-family houses before, I can tell you that architects consider only the choices they know. That is why, where I live, there are 1000 stick-framed houses for every one built with something like SIP (structural insulated panels) or ICF (insulated concrete forms), even though the alternatives are similar in overall cost to build and have useful advantages that would often make them a better deal for the homeowner long term.
This also means there are 100 builders in my area who only build stick frame houses and won't even talk to you if you want something else and only 1 or 2 who will even think about those other options. (they do compete with the other builders so costs are not unreasonable)
This tracks with my experience with architects (for home renovations): they use what they know, sometimes stubbornly so, even when I point out it won't work (e.g. materials inappropriate for the climate or for exposure to direct sunlight or water, obsession with visuals over practicality, etc.).
I've dealt with enough architects by now to know this is the rule and not the exception.
A well-implemented system would somehow allow you to use your ID to prove you have the attributes a service needs (being over 18, being able to drive, having no criminal record, not being a communist, or whatever it is they need) without providing any further information that would allow multiple services to correlate IDs against each other.
Making confirmations of those attributes easily available will only result in more services requiring them. It's not worth the convenience for the vanishingly few cases where such a verification is actually beneficial.
The actual fines for this moving forward are up to 10% of a company's global revenue. The EU made a big point of saying that this is the first time they are issuing these fines and that, as a result, they are smaller than they otherwise would be; repeat offenders can expect larger ones.
Sure. They had a creation mythos like everyone else. What they didn't have is evidence of a real precursor civilization to ground those myths. The classical Greeks could see Mycenaean ruins.
Not as many as later civilizations, but there are structures that likely pre-date the Sumerian civilization, like the desert kites. And in Syria and Turkey there are megaliths and ruins older than Sumer, which the Sumerians might have known of from oral history.
> They believed their writing was gifted to them by the Gods.
This is an essentially universal belief in the past, and not just about writing. People are able to notice that their lifestyle depends on technologies, and that the only way to learn those technologies is for someone else to teach you. So they decide that the technologies on which their lives depend - pressing olives, farming grain, writing, harvesting wool*... - were taught to their ancestors by the gods.
In the case of writing specifically, the ancient Greeks attributed it to Cadmus, who was not personally a god. But (1) he was a hero with descent from Poseidon, (2) Greek heroes receive prayers and sacrifices and grant supernatural blessings just the same way gods do, and (3) they credited him with introducing writing from Phoenicia, not inventing it out of whole cloth.
* In early records, sheep are not yet sheared - they're plucked. The sheep we have today aren't the sheep they had then.
> "Shearing" is actually a misnomer. The Akkadian term was "plucking." Before the end of the Bronze Age, domestic sheep did not continuously grow wool, and the wool could be combed or plucked when their coats shed in the spring.
Why would you think modern sheep make a relevant comparison point after I explicitly point out that they don't?
I've been programming Rust professionally for several years now and being totally honest I don't really understand Pin that well. I get the theory, but don't really have an intuitive understanding of when I should use it.
My usage of `Pin` essentially comes down to "I try something, and the compiler complains at me, so I pin stuff and then it compiles."
It's never been such a hurdle in the day to day coding I need to do that I've been forced to sit down and truly grok it.
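For what it's worth, the place most people first hit `Pin` is manually polling a future, since `Future::poll` takes `Pin<&mut Self>`. Here's a minimal, self-contained sketch (the no-op waker boilerplate and the `run` function are invented for illustration):

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// A waker that does nothing - just enough to build a Context for polling.
fn noop_waker() -> Waker {
    fn clone(_: *const ()) -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) }
}

fn run() -> i32 {
    // An async block compiles to a (potentially self-referential) state
    // machine, so it must be pinned before it can be polled - that's why
    // poll takes Pin<&mut Self>, and why Box::pin shows up here.
    let mut fut = Box::pin(async { 21 * 2 });
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);
    match fut.as_mut().poll(&mut cx) {
        Poll::Ready(v) => v,
        Poll::Pending => unreachable!("this future is ready immediately"),
    }
}

fn main() {
    println!("{}", run());
}
```

The intuition: once a self-referential value is pinned, it promises never to move in memory again, so internal pointers into itself stay valid.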
The problem isn't so much the lack of enums, it's more that there is no way to do exhaustive pattern matching at compile time. I have seen production systems go down because someone added a variant to an 'enum' but failed to handle that new variant everywhere.
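In Rust terms (the enum and function names here are made up for illustration), the property being described is that a match with no catch-all arm turns "forgot to handle the new variant" into a compile error instead of a production incident:

```rust
#[allow(dead_code)]
enum PaymentState {
    Pending,
    Settled,
    Failed,
    // Adding a new variant here (say, `Refunded`) makes the match below
    // fail to compile until every call site handles it.
}

fn describe(s: &PaymentState) -> &'static str {
    match s {
        PaymentState::Pending => "waiting",
        PaymentState::Settled => "done",
        PaymentState::Failed => "error",
        // No `_ =>` arm: the compiler enforces exhaustiveness.
    }
}

fn main() {
    println!("{}", describe(&PaymentState::Settled));
}
```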
Yeah, it makes a huge difference whether this is the default, and that's not a Go thing, it's not a programming language thing, it's the whole of human existence.
Literacy is an example. We're used to a world where it's just normal that other humans can make marks and interpret our own marks as to meaning. A thousand years ago that would be uncommon, ten thousand years ago vanishingly rare.
I really celebrate tools which take an idea that everybody agrees is a good idea, and bake it in so that everybody just has that, rather than agreeing it's a good idea but, eh, I have other things to do right now.
And it annoys me when I see these good ideas but they're left as an option, quietly for a few people to say "Oh, that's good to see" and everybody else misses out, as if "Literacy" was an optional high school class most of your peers didn't take and now they can't fucking read or write.
Example: Git has a force push feature: necessarily, we can (if we have rights) overwrite a completely unrelated branch state - given any state X, the state is now our state Y instead. This isn't the default, and that part is fine... Git also has "force-with-lease", which is a much better feature. Force-with-lease says "I know the current state of this branch is X, but I want to overwrite it anyway". If we're wrong, and X is not (any longer, perhaps) the current state, the push fails. But force-with-lease isn't the default.
[Edited to fix clumsy wording in the last sentence]
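A throwaway demo of the difference (assumes `git` is installed; the repo layout and names are invented for illustration). Bob rewrites history while his view of the branch is stale, and `--force-with-lease` rejects the push because the remote has moved since he last fetched:

```shell
set -eu
tmp=$(mktemp -d)
cd "$tmp"
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

git init -q --bare origin.git
git clone -q origin.git alice 2>/dev/null
git clone -q origin.git bob 2>/dev/null

# Alice publishes a base commit.
cd alice
git commit -q --allow-empty -m "base"
git push -q origin HEAD:main
# Bob fetches it; his remote-tracking ref now records this state.
cd ../bob
git fetch -q origin
git checkout -q -b main origin/main

# Alice pushes another commit; Bob's view of main is now stale.
cd ../alice
git commit -q --allow-empty -m "alice's change"
git push -q origin HEAD:main

# Bob rewrites his (stale) history. A plain --force would silently
# clobber Alice's commit; --force-with-lease fails instead, because
# the remote no longer matches Bob's last-fetched state.
cd ../bob
git commit -q --amend --allow-empty -m "bob's rewrite"
if git push -q --force-with-lease origin main 2>/dev/null; then
  result=clobbered
else
  result=rejected
fi
echo "$result"
```

Bob then has to fetch, look at what changed, and decide deliberately - which is exactly the safety net a plain `--force` skips.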
The overarching lesson of my career has been that people are fallible, and processes that rely on humans not making mistakes are bound to fail. Recently I've also been thinking about the importance of being able to reason "locally" about code; it might be the single most important property of a software system. "Locally" typically means a function, class, or module, but I think it can also be applied to "horizontal" cases like this. For example, if you add an enum variant, the compiler should guide you to reason about each place where the enum is no longer exhaustively matched.
In theory, yes. But practically there are always locations where we can't match every case, so we either have to live with a warning or add a catch-all arm. And as soon as a catch-all arm exists, we are in a "not checked any more" state, but with a compiler that is supposed to check for exhaustiveness. Which is way worse if the catch-all arm isn't a `panic("Unhandled match branch!")`.
Yes, you can counter that with GADTs and match against exhaustive subsets. But to ergonomically handle these cases, you need something like Pattern Synonyms or you drown in boilerplate.
Same as above: this solution stops working as soon as you're handling 5 out of 50 cases (or, more realistically, 10 out of 200). Lexical tokens are what always triggers the mentioned problems in my code - often you match against subsets, as there are _way_ too many of them to add them all explicitly.
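A minimal Rust sketch of the failure mode being described (the token names are invented; real lexers have dozens more variants, which is exactly why the catch-all shows up):

```rust
#[allow(dead_code)]
enum Token {
    Plus,
    Minus,
    Ident,
    Number,
    // ...dozens more in a real lexer
}

fn is_operator(t: &Token) -> bool {
    match t {
        Token::Plus | Token::Minus => true,
        // Catch-all: a newly added operator variant (say, `Star`) would be
        // silently classified as "not an operator", and the exhaustiveness
        // checker can no longer help - the `_` arm absorbs it.
        _ => false,
    }
}

fn main() {
    println!("{}", is_operator(&Token::Plus));
}
```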
First of all, I think there is a problem with your design if you have 50 or 200 cases. Most of the code I have seen -- in any language -- has no more than 10 cases, rarely 20. Maybe that is the issue to look at first.
Then, this "limitation" is not an argument against running the exhaustiveness check. In the vast majority of cases, where there are about 5 enum entries and most cases need their own path, they would be explicitly written out (i.e. no `_ =>`), and this works extremely well. I have had good experiences with it, and I believe other people can attest to this.
If you only have one case matched and everything else goes into `_`, and you later need to add a case, you just do that; there is nothing any language can do to help in that situation. But what's described above is already a big improvement.
> this "limitation" is not an argument for not running the exhaustive check.
That's not what I wanted to express. What I wanted to say is that even when using Haskell, which has all the machinery to handle matches of subsets quite ergonomically, you can't be sure that there isn't at least one case that the exhaustiveness checker misses (and sometimes it's just wrong, but then we're not talking about enums). So you always have to check everything manually; the checker just makes that easier.
This is actually the problem. I'd even say it works like 95% of the time, which is why people (of course, I don't make such silly mistakes ;) aren't in the habit of checking where it matters.
In reality, the policy must be to always (whether or not there are working exhaustiveness checks) manually check each and every occurrence when adding a case. Don't get me wrong, I prefer having exhaustiveness checks, but they make the manual searching a bit less tedious; they don't eliminate it entirely.
I should have added that this solution stops working as soon as you're handling 5 out of 50 cases. Lexical tokens are what always triggers the mentioned problems in my code - often you match against subsets, as there are _way_ too many of them to add them all explicitly.
But you're not adding variants without a reason? You want them to have some effect.
It's hard for me to think of an example where "having to remember to handle the variant" would even make sense, rather than "handling the desired effect of the variant".
People are stressed, get distracted, are tired, don't have complete knowledge, etc. This is kind of like arguing that null pointers aren't a problem, that you "just" have to check every usage of the pointer if you make it null. In practice, we know solutions like this don't work.
I'd add that every switch/if should handle this exhaustively. For any project with more than a few dozen files, it is basically impossible to remember all the downstream code that uses the enum -- you have to track it down, or better, let the compiler automatically check all usages.