Sometimes? If you have enough altitude to trade for speed then after the cutoff you could glide to a hypothetical miraculously-placed runway right in front of you, vs. having fire quickly consume the entire plane if you don't cutoff..
It's not that rare, and there are institutional factors (such as seeking treatment for psychosis being career-ending for a pilot) that incentivize serious pilot mental health crises being untreated.
> There is no possible way to confuse these two actions.
This is obviously an overstatement. Any two regularly performed actions can be confused. Sometimes (when tired or distracted) I've walked into my bathroom intending to shave, but mistakenly brushed my teeth and left. My toothbrush and razor are not similar in function or placement.
That's just your brain associating the bathroom with the act of brushing your teeth, and therefore doing it automatically upon the trigger of entering the bathroom. It bears no resemblance to the accidental activation of a completely different button.
The other poster's correction: "it’s like brushing your teeth with razor" is apt. Touching the fuel cutoff switches is not part of any procedure remotely relevant to the takeoff, so there's no trigger present that would prompt the automatic behavior.
Good analogy. Things I do every day in front of the mirror, but I occasionally attempt to squeeze some soap on my toothbrush. Or I have to brush my teeth and I find my beard foamed up. Or I walk out of the shower after only rinsing myself with water.
Not a bathroom one, but the number of times I've tried to pay for public transport with my work/office fob is mental. Generally happens on days where I'm feeling sharper than average but also consumed with problem solving
If someone confused their steering wheel for the brake you'd probably be surprised - there are indeed errors that are essentially impossible for a competent person to make by mistake. No idea about the plane controls, though.
Even in modern "fly by wire" cars the steering wheel and brake pedal have an immediate effect. They are essentially directly connect to their respective control mechanisms. As far as I understand both of the plane controls on question just trigger sequences that are carried out automatically. So it's more like firing off the wrong backup script than scratching the wrong armpit.
The only two production cars on sale where the steering wheel is mechanically decoupled from the wheels are the cybertruck and a variant of the Lexus RX.
Essentially impossible is not the same as impossible. We already know that an improbably sequence of events took place because a plane crashed which is highly unusual.
Technically an overstatement but not by much. Correctly restated, its highly unlikely these actions were confusing pilots. It's as if you mistook flushing your toilet twice when instead you wanted to turn on the lights in your bathroom.
I don't agree with the "twice". A frequently performed manipulation like the fuel cutoff (usually performed after landing) collapses down to a single intention that is carried out by muscle memory, not two consciously selected actions.
Not really, though. They're both (retracting the gear, and cutting off fuel) just toggle switches, as far as your brain's conscious mechanisms go. Doing them both on every flight dulls the part of your brain that cares about how they feel different to perform.
(I'm not strongly arguing against the murder scenario, just against the idea that it's impossible for it to be the confusion scenario.)
I meant philosophical toggle switches, not physical ones. The gear can go between down and up. The fuel can go between run and cutoff. Given enough practice, the brain takes care of the physical actions that manipulate those philosophical toggles without conscious thought about performing them.
Genuinely curious - could heavy marijuana use cause confusion between landing gear and fuel cutoff? Or some other drugs? (Wondering if they screen pilots for alcohol before they board an aircraft.)
The issue is that anything put into an LLM thread can alter the behavior of the LLM thread in significant ways (prompt injection) leading to RCE or data exfiltration if certain scenarios are met.
I don't know that ChatGPT's voice mode is using audio as a transformer input directly.
It could just be using speech to text (e.g. Whisper) on your input, and then using its text model on the text of your words. Or has OpenAI said that they aren't doing this?
OpenAI does not provide many details about their models these days but they do mention that the "Advanced voice" within ChatGPT operates on audio input directly:
> Advanced voice uses natively multimodal models, such as GPT-4o, which means that it directly “hears” and generates audio, providing for more natural, real-time conversations that pick up on non-verbal cues, such as the speed you’re talking, and can respond with emotion.
Then the malware would provide that confirmation to the wallet too. Defending yourself from malware running on the same (Windows) machine is mostly impossible.
>An app reading from clipboard must ring all sorts of alarms. Let alone writing to it.
You realize any sort of content editing app is going to be reading from clipboard? Most apps used on a daily basis are going to be reading from clipboard.
You should try AI sometime. It's quite good, and can do things (like "analyze these 10000 functions and summarize what you found out about how this binary works, including adding comments everywhere) that individual humans do not scale to.