Just dry the fungus and burn it. The process is fairly analogous to photosynthesis in plants, so we've basically already been doing this since we learned to make fire, just with a different source of radiation.
Handling nitrogen is more dangerous than CO2 for the same reason it is more humane for the animals: you don't notice you're being asphyxiated. Not that it is impossible or even particularly difficult to handle nitrogen safely, but it would incur a cost (training, equipment, ...), and inevitably result in accidents when those costs are skimped on.
Unless legislation changes, it's simply better business to let the animals suffer.
Yeah, and while handling radium paint on a daily basis probably isn't the best, they ingested a lot more radium than you would from just being around it all day. Radium paint had been deemed non-toxic, and so the standard operating procedure for the factory workers was to "point" the tips of the brushes using their lips.
Personally I don't mind spending several hours solving a problem over "async communication" if that means I'm free to work on other stuff while the other party is formulating a response. Then I also get the benefit of having something in writing from them to refer back to.
The kind of person who takes hours to explain something in written form is unlikely to explain it in 3 minutes in person. More likely, they set up a meeting where they waffle on about an issue, expecting the receiving end to distill some valuable information from their ramblings, and then inevitably end up complaining when the solution doesn't match their expectations (which of course were never formalized anywhere).
If you take the "works" part out of the equation they seem really eager though. They regularly take a single flying thing and turn it into many flying things.
Thank you! I recently got bitten by this limitation and will make use of get_parameter_source.
I credited/mentioned you in the issue about restoring our use of the `multiple=True` behavior: https://github.com/gptme/gptme/issues/560
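For anyone else hitting the same limitation: a minimal sketch of how Click's `Context.get_parameter_source()` (available since Click 8.0) distinguishes a user-supplied value from the default, which is otherwise ambiguous for `multiple=True` options. The `--tool` option name here is just an illustrative stand-in, not gptme's actual CLI.

```python
import click
from click.core import ParameterSource
from click.testing import CliRunner

@click.command()
@click.option("--tool", "tools", multiple=True, default=("browser",))
@click.pass_context
def cli(ctx, tools):
    # With multiple=True, the default tuple and a user-supplied tuple look
    # identical in `tools`; the parameter source tells them apart.
    source = ctx.get_parameter_source("tools")
    if source is ParameterSource.DEFAULT:
        click.echo(f"default tools: {', '.join(tools)}")
    else:
        click.echo(f"user tools: {', '.join(tools)}")

runner = CliRunner()
default_run = runner.invoke(cli, [])                  # no flags -> default
user_run = runner.invoke(cli, ["--tool", "shell"])    # explicit flag
```

`ParameterSource` also reports `ENVIRONMENT` and `DEFAULT_MAP`, so the same check works when values come from env vars or a config-backed default map.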
Yes they can. If the token you give the LLM isn't permitted to access private repos, then you can lie all you want; it still can't access private repos.
Of course you shouldn't give an app/action/whatever a token with overly lax permissions, especially not a user-facing one. That's not in any way unique to tools based on LLMs.