You can add support-contract requirements for some pieces of software, coming from vendors with so little confidence in their product that they'd rather have it run in an environment of outdated dependencies. A side effect of the logic you talked about, applied to software vendors.
Yet again an example of (for some, not-so) old, control-freak farts who just don't understand the world they live in. The bill is titled "Regulation of digital space". As if a single country could regulate an international network.
Sometimes I'm really ashamed of our politicians.
Unplug your mouse. Seriously. Do it. It might sound like the "kicking and screaming" method, but you'll learn to rely on your keyboard even for GUI tools and you'll vastly improve how fast you navigate your computer. You should find yourself more and more in the terminal, obviously, but you may also pick up some nice tricks with everything else.
Simple: because it goes against the KISS principle. The GNU tools that constitute the user interface to the system come from a philosophy that started with Unix: simple tools, each doing one thing well, communicating through "pipes" - i.e. the output of one tool is meant to be used as the input of the next one.
Such a philosophy lets you assemble complex logic and workflows from a few commands, automating a lot of mundane tasks, while also letting you work at a large scale the same way you would work on a few files or tasks.
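For instance (a sketch - the log path and format are just assumptions for the example), finding the ten most frequent client IPs in a web server log by chaining standard tools:

```sh
# Print the first field (the client IP in a common log format),
# group and count identical IPs, then keep the ten most frequent.
awk '{print $1}' /var/log/nginx/access.log \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -n 10
```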
Graphical tools don't have such advantages: they don't compose with one another, and whatever you do by hand through a GUI rarely scales or gets automated.
Beware though: as time passes, Unix's founding principles seem to get forgotten, and some CLI tools show a lack of user-experience design, diverging from the usual philosophy and making the life of system administrators difficult. I've often observed this with tools coming from recent languages - Python, Go, Rust - where the "interface" of the tool is closer to the language it's written in than to the uniform CLI interface.
Nice idea! All the more since, while software-related AI seems to be more and more open-sourced, it really looks to me like it's seriously lacking on the hardware side. Open-source code running on proprietary hardware is a good start, but we'd all be better off if it were open and public through and through.
Try French: the same written word pronounced differently depending on its meaning - which might only be deduced from context - differently written words pronounced the same - with different meanings, obviously... and there's always poetic license if you wish to muddle things a bit more!
The example is more an issue of using the right tool for the job: class inheritance and method overriding would properly deal with the employee/manager/C-suite distinction - along the lines of the sketch below.
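A minimal sketch in Python (the class names and bonus rules are made up for illustration):

```python
# Shared behaviour lives in the base class; each subclass
# overrides only what actually differs.
class Employee:
    def __init__(self, name: str, salary: float):
        self.name = name
        self.salary = salary

    def bonus(self) -> float:
        # Default rule for rank-and-file employees (made-up figure).
        return self.salary * 0.05


class Manager(Employee):
    def bonus(self) -> float:
        # Managers override the default rule.
        return self.salary * 0.10


class CSuite(Manager):
    def bonus(self) -> float:
        # The C-suite overrides it again.
        return self.salary * 0.20


# Calling code never needs to ask "what kind of employee is this?":
for person in [Employee("Ann", 40000), Manager("Bob", 60000), CSuite("Eve", 90000)]:
    print(person.name, person.bonus())
```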
All recent CPUs have native virtualization support (Intel VT-x, AMD-V), so there's close to no performance hit for VMs.
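On Linux you can quickly check for it (a sketch, assuming an x86 machine):

```sh
# Count the CPU flag lines advertising hardware virtualization:
# vmx = Intel VT-x, svm = AMD-V. A non-zero count means it's there.
grep -Ec 'vmx|svm' /proc/cpuinfo
```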
That being said, even a VM is subject to exploits, and malicious code could break out of the VM down to its hypervisor.
The only secure way of running suspicious programs starts with an air-gapped machine and a cheap HDD/SSD that will go straight under the hammer as soon as testing is complete. And even then, I'd still wonder whether the BIOS might have been compromised.
On a lower level of paranoia and/or threat, a VM on an up-to-date hypervisor, with a snapshot taken before doing anything questionable, should be enough. You'd then only have to fear a zero-day exploit of said hypervisor.
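For instance, with a libvirt/QEMU setup (a sketch - "sandbox" is a hypothetical guest name):

```sh
# Take a snapshot before running anything questionable...
virsh snapshot-create-as sandbox pre-test

# ...and roll the whole VM back to it once you're done poking around.
virsh snapshot-revert sandbox pre-test
```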
@steph@lemmy.clueware.org