With respect: a lot of us out here know and have used many of those the same way; we’re silently aware of the intent. I used to be that way. Over time the need to fake it fell away; now I just mock everyone through muted indifference and a shrug, “good job at being a member of social life like everyone else” kind of energy.
Emotional archetypes are limited. You have borrowed others’ ideas because that’s how it works; you memorized such emotional states from others. Awareness of such emotional states is not yours alone.
See. That’s how you put someone down. Directly. Not through passive-aggressive southerner classics. You’re far too obvious to those who have diverse real-world experience, and you just come off as a cliché. But we silently eye-roll rather than validate such antics through feedback, good or bad.
We had conversations about going to the moon and we got there in 10 years.
Talk about producing less disposable crap, driving less, flying less, and cutting military budgets, and everyone in the US gets “exercise their 2A rights” about it.
> Talk about producing less disposable crap, driving less, flying less, and cutting military budgets
That comment illustrates the problem pretty well. We could do all those things and we'd still be way, way behind on what climate scientists say we actually need to do, to avoid disaster.
What we need to do is reduce emissions to zero in less than thirty years, with steep declines starting now. That's not just electric grids, it's all transportation, industry, direct emissions from steel and concrete production, agriculture, everything. And then we need to start pulling massive amounts of excess CO2 back out of the atmosphere. Going to the moon was trivial by comparison; without a breakthrough in cheap fusion or something we're not likely to achieve all this.
SRM isn't a solution but it could buy us a little time, before higher temperatures get the planet releasing gigatons of CO2 and methane without any further help from us.
Cost. The "cost" of all of those things is "too high" for the average American, as evidenced by the reaction to such proposals. So, we gotta do something else.
Because even if every American stopped driving _tomorrow_, we are still going to have the effects of climate change anyway. We can't stop India and China from developing, so instead of trying to limit Americans' rights in ways that might not even help, it's better to develop a solution such as the one in TFA.
The Greeks wrote, in jest, that they hoped no one would climb to the top of Olympus; people would revolt upon finding there are no gods there.
To an extent those in the upper echelons of human society have always known the magical stories were bogus. But when the flock is dumber than Humperdoo himself, Humperdoo will do.
Pretty big jump from "I have no moral qualms with advertising" to "I am actively seeking to manipulate people into providing sexual favors to my coworkers"
Says something about a person's moral fibre if they take a bribe but don't deliver the thing they were bribed to do. That kind of lack of character does not align with our values.
That’s what a musician does. They make short loops and loop them.
This reads like someone who knows sheet music and theory but does not listen to music. It’s repetition of short phrases over and over.
I’m not really sure what people expect of general AI trained on human-generated outputs. It can’t make up anything “net new”; it can only compose based upon what we feed it.
I like to think AI is just showing us how simple minded we really are and how our habit of sharing vain fairy tales about history makes us believe we’re masters of the universe.
Those models are not trained on short loops. They are trained on whole songs just like image generation models are trained on whole images. And yet they struggle to repeat sections, modulate to a different key, create bridges, intros and outros. After a few seconds of hallucinating a melodic line they simply abandon the idea and migrate to another one. There is no global structure whatsoever.
We're trying to train a full composer AI without allowing it to learn about the different instrument sections independently at first. A human composer will have a good idea of the different parts and know how to merge them in harmony.
I think we might get better results training separate AI systems on percussion, strings, vocals, etc., then somehow creating connections between them so they learn together. A band AI, if you will.
We could try a BERT for each, with the generator learning to output logical sequences of sounds instead of words.
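To make the idea concrete, here's a rough, purely hypothetical sketch of what that "band AI" could look like in PyTorch: a small BERT-style encoder per stem, with a shared fusion transformer so each part can condition on the others. Everything here (stem names, vocab sizes, dimensions, the fusion scheme) is an assumption for illustration, not a real system.

```python
# Purely hypothetical sketch (PyTorch): one small BERT-style encoder per stem,
# plus a shared "fusion" transformer so the parts can attend to each other.
# Stem names, vocabulary sizes, and dimensions are made up for illustration.
import torch
import torch.nn as nn

class StemEncoder(nn.Module):
    """BERT-like encoder over one tokenized stem (e.g. drums as discrete tokens)."""
    def __init__(self, vocab_size=512, d_model=256, n_layers=4, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, tokens):                      # tokens: (batch, seq_len)
        return self.encoder(self.embed(tokens))     # (batch, seq_len, d_model)

class BandModel(nn.Module):
    """Separate encoders per stem, joined so each part 'hears' the others."""
    def __init__(self, stems=("drums", "bass", "vocals"), vocab_size=512, d_model=256):
        super().__init__()
        self.stems = stems
        self.encoders = nn.ModuleDict({s: StemEncoder(vocab_size, d_model) for s in stems})
        fusion_layer = nn.TransformerEncoderLayer(d_model, 4, batch_first=True)
        self.fusion = nn.TransformerEncoder(fusion_layer, 2)
        self.heads = nn.ModuleDict({s: nn.Linear(d_model, vocab_size) for s in stems})

    def forward(self, stem_tokens):                 # dict: stem name -> (batch, seq_len)
        encoded = [self.encoders[s](stem_tokens[s]) for s in self.stems]
        fused = self.fusion(torch.cat(encoded, dim=1))   # joint context across stems
        chunks = torch.split(fused, [e.shape[1] for e in encoded], dim=1)
        # Per-stem token logits, e.g. for a masked-token training objective.
        return {s: self.heads[s](c) for s, c in zip(self.stems, chunks)}

# Toy usage: three stems of 64 tokens each, batch of 2.
model = BandModel()
batch = {s: torch.randint(0, 512, (2, 64)) for s in ("drums", "bass", "vocals")}
logits = model(batch)
```

Whether training the stems separately first and only then adding the fusion stage helps is an open question; the point is just that the parts get their own representations before being forced to play together.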
Musicians don’t spit out an album in one sitting, and they’re highly trained in theory. They get bored and tired of the process and take breaks. They come up with an album of loops composed together over time.
AI's state will forever be constrained to the limits of human cognition and behavior, as that's what it's trained on.
I read published research all year. Circular reasoning. Tautology. It’s all over PhD theses.
There’s no “global structure” to humanity. Relativity is a bitch.
Seeing the world through the vacuum of embedded inner monologue ignores the constraints of the physical one. It’s exhausting dealing with the mentality that some clean-room idea we imagine in a hammock can actually exist in a universe being ripped asunder by entropy.
It’s living in memory of what we were sold: some ideal state. Very akin to religious and nation-state idealism.
I think it's deeply depressing that AI has been sold as something even capable of modelling anything humans do; and quite depressing that this comment exists.
"AI" is just taking `mean()` over our choice of encodings of our choice of measurements of our selection of things we've created.
There is as much "alike humans" in patterns in tree bark.
AI is an embarrassingly dumb procedure, incapable of the most basic homology with anything any animal has ever done; us especially.
We are embedded in our environments, on which we act, and which act on us. In doing so we physically grow, mould our structure and that of our environment, and develop sensory-motor conceptualisations of the world. Everything we do, every act of the imagination or of movement of our limbs, is preconditioned-on and symptomatic-of our profound understanding of the world and how we are in it.
The idea that `mean(424,34324,223123,3424,....)` even has any relevance to us at all is quite absurd. The idea that such a thing might sound pleasant thru' a speaker, irrelevant.
This is a product of I don't know what. On the optimist side, a cultish desire to see Science produce a new utopia. On the pessimist side, a likewise delusional desire to see Humans as dumb machines.
I lack your confidence, and find it a bit religious.
> The idea that `mean(424,34324,223123,3424,....)` even has any relevance to us at all is quite absurd.
Most of what I say to anyone is exactly this.
When I'm about to give anyone any information, I look back at all of the relevant past information that I can recall (through word and sensory association, not by logic, unless I have a recollection of an associated internal or external dialog that also used logical rules.) I multiply those by strength of recollection and similarity of situation (e.g. can I create a metaphor for the current situation from the recalled one?). I take the mean, then I share it, along with caveats about the aforementioned strength of recollection and similarity of situation.
This is what it feels like I actually do. Any of these steps can be either taken consciously or by reflex. It's not hidden.
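As a toy caricature (my own illustration, definitely not a cognitive model), the process above boils down to a weighted mean plus a confidence caveat:

```python
# Toy rendering of the process described above: recall items by association,
# weight each by recollection strength and situational similarity,
# take the weighted mean, and report a confidence caveat alongside it.
# All names and numbers here are made up for illustration.
from dataclasses import dataclass

@dataclass
class Memory:
    value: float        # the recalled "information", flattened to a number
    strength: float     # how well I remember it, 0..1
    similarity: float   # how well the old situation maps onto this one, 0..1

def answer(memories: list[Memory]) -> tuple[float, float]:
    weights = [m.strength * m.similarity for m in memories]
    estimate = sum(m.value * w for m, w in zip(memories, weights)) / sum(weights)
    confidence = sum(weights) / len(memories)   # the aforementioned caveat
    return estimate, confidence

print(answer([Memory(10, 0.9, 0.8), Memory(14, 0.4, 0.5), Memory(12, 0.7, 0.9)]))
```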
> I think it's deeply depressing that AI has been sold as something even capable of modelling anything humans do
This is a bizarre position. All computers ever do is model things that humans do. All a computer consists of is a receptacle for human will, one that continues to apply that will after the human is removed. They are a way of crystallizing will such that you can sustain it with things (like electricity) other than the particular combination of air, water, food, space, pressure, temperature, etc. that is a person. An overflow drain is a computer that models human will. An automatic switch/regulator is the basic electrical model of human will, and a computer is just a bunch of those stitched together in a complementary way.
You're an animal. You've no idea what you do, and you're using machines as a model. Likewise, in the 16th C. it was brass cogs; and in ancient Greece, air/fire/etc.
You're no more made of clay & god's breath than you are of sand and electricity.
You're an oozing, growing, malleable organic organism being physiologically, dynamically shaped by your sensory-motor oozing. You're a mystery to yourself, and these self-reports, heavily coloured by the in-vogue tech, are not science; they're pseudoscience.
If you want to study how animals work, you'd need to study that. Not these impoverished metaphors that mystify both machines and men. No machine has ever acquired a concept through sensory-motor action, nor used one to imagine, nor thereby planned its actions. No machine is ever at play, nor has grown its muscles to be better at-play. No machine has, therefore, learned to play the piano. No machine has thought about food, because no machine has been hungry; no machine has cared, nor been motivated to care by a harsh environment.
An inorganic mechanism is nothing at all like an animal, and an algorithm over a discrete sequence of numbers with electronic semantics, is nothing like tissue development.
What you are doing is not something you can introspect. And you aren't really doing that. Rather, you've learned a "way of speaking" about machine action and are back-projecting that onto yourself. In this way, you're obliterating 95% of the things you are.
This isn't really responsive. Not only am I not using machines as any sort of model for human behavior, I'm trying to think about weird things you could do to a machine to make it ape a human.
> these self-reports, heavily coloured by the in-vogue tech are not science, they're pseudoscience.
I simply don't know what you're referring to. If you're referring to retrieving memories through associations, there's mountains of empirical evidence for that. If you're referring to wondering if I remember things, and being unsure of the information I'm recalling when I have less recall of that, or wondering if past situations compare well to current situations, well you got me. It's my personal belief that conscious thought is an epiphenomenon that is a rationalization of decisions already made.
But the rest of this is nonsense. Vivid imagery is not an argument for exceptionalism, no matter how much I say things drip or ooze. This is just association in action. You're trying to create a distinction for life (or rather what you recognize as life): life oozes and has viscera, so using a bunch of words that feel wet and organ-y can substitute for reason contra the robots.
Not much there to take seriously in their claims of being more adult than others.