What happens next?
McDonald's releases a new burger!
Because some users are putting that data on Linux. So they want Linux to be killed.
They can’t change grub. But they sure as hell can convince micro$org to search for and nuke it.
Of course, no idea if this happened. Just answering why they might want to.
Cool. At the time, it was one of the best. Although I also liked SunOS.
I also worked with VMS a lot after uni. Hated using it. But had to respect the ideals behind it.
But watching the growth of Linux has been fantastic. In 2024, it does seem to have out-evolved all the others. (Evolved, defined as: developed the ability to survive by becoming so freaking useful.)
I am starting to think it is time for a microkernel version, though.
Was a few years later for me.
Not DMU by any chance?
In the late 1990s my uni had Unix workstations running HP-UX.
So all projects etc. were expected to be done on those. Linux at the time was the easy way to do that from home.
By the time I left uni in '98, I was so used to it that Windows was a pain in the butt.
For most of the time since, I have been almost 100% Linux, with just a dual boot to sort out some hardware/firmware crap.
Ham radio, to this day. Many products can only do updates with Windows.
With the number of cheaper, open-bed printers around: a lot. Keeping them inside does not prevent the particles from entering the environment. We need to breathe to start with, so airflow will carry them outside. Add vacuum cleaning and waste disposal. Unless the plastics are trapped and melted into larger clumps, they get into the environment. This is why they are so dangerous.
Even with enclosed printers: unless they are very well filtered, with some disposal plan for that filter, it's just an extra delay.
Some plastic types are better than others. And I honestly think developing thermoplastic replacements is better than stopping 3D printing.
3D printing is a broad subject, covering most materials.
It is just current home printing that is mainly plastics, because the cost rises with other materials. Plastics allow printers at $200 or so.
But it does not have to stay bad. We are starting to see more and more research into effective plastic replacements. And the expansion of cheap 3D printing can theoretically speed up the distribution of those alternatives.
“ThE sCiEnTiStS wIlL jUsT cLeAn It Up AfTeR”
Yep, I know nothing below will happen, thanks to our world's political motives. But...
If we charge for manufacture, dramatically increasing the cost to use these chemicals, to fund said science, we win both battles: reduce the desire to use them while increasing investment in alternatives. And fund the clean-up.
The issue is that layout containers are often unable to display large fonts on their widgets when certain restrictions are enforced in the layout. So, without taking the time to try very large fonts, many developers never discover that their applications just cannot be read by visually impaired people.
Unfortunately this is an issue for anyone visually impaired.
People just do not stop to think how difficult their UX is for people who need very large fonts.
Likely 11k living with humans.
Just off the top of my head, discovered today.
Not a GUI, as one exists, but a more configurable one, as the current one is crap for the visually impaired.
The rpi-imager GUI does not take theme hints for font size etc. Worse, it has no configuration to change such things.
That makes it pretty much unusable for anyone with poor vision.
It also varies for each visually impaired individual. But dark mode is essential for some of us.
So if you're looking for small projects, you'd at least make me happy ;)
Nice idea, I love it. But you have to remember, those investigations cost huge amounts of time and money. Consider the cost of full-time staff over the 10 years you include, plus the cost of building a case against some of the largest corporations, all before any court costs are considered.
We are likely better off having that money reinvested in preventing other companies from these practices.
Yep, pretty much, but on a larger scale.
1st, please do not believe the bull that there was no problem. Many folks like me were paid to fix it before it was an issue. So other than a few companies, few saw the result, not because it did not exist, but because we were warned. People make jokes about the over-panic. But if that had not happened, it would have taken years to fix, not days, because without the panic, most corporations would have ignored it. Honestly, the panic scared shareholders, so boards of directors had to get experts to confirm the systems were compliant. And so much dependent crap was found running that it was insane.
But the exaggerations of planes falling out of the sky etc. were also bull. Most systems would have failed, but BSODs would be rare; code would crash, some would stop with errors and shut down cleanly, and some failures would go undiscovered until a short while later, as accounting or other errors showed up.
As others have said, the issue was that since the 1960s, computers were set up to treat years as 2 digits, so they had no way to handle 2000 other than to assume it was 1900. While from the early 90s most systems were built with ways to adapt to it, not all were, as many developers were only working on top-layer stuff, and many libraries etc. had not been checked for this issue. Huge amounts of the world's IT infrastructure ran on legacy systems, especially in the financial sector where I worked at the time.
The internet was a fairly new thing, so often stuff had been running for decades with no one needing to change it, or having any real knowledge of how it was coded. So folks like me were forced to hunt through code, or often replace systems that were badly documented or, more often, not documented at all.
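For anyone who never saw this class of bug, here is a minimal sketch of the two-digit-year failure mode. The function names are mine for illustration, not from any real legacy system; the point is how a "19xx assumed" parser quietly works for decades and then produces wildly wrong arithmetic at the rollover:

```python
from datetime import date

def parse_legacy_date(yymmdd: str) -> date:
    """Naive parser in the style of old YYMMDD record formats:
    any two-digit year is assumed to mean 19xx."""
    yy, mm, dd = int(yymmdd[0:2]), int(yymmdd[2:4]), int(yymmdd[4:6])
    return date(1900 + yy, mm, dd)

def days_between(start: str, end: str) -> int:
    """A downstream calculation (interest, age, billing period)
    built on top of the naive parser."""
    return (parse_legacy_date(end) - parse_legacy_date(start)).days

print(parse_legacy_date("991231"))       # 1999-12-31: fine for decades
print(parse_legacy_date("000101"))       # 1900-01-01: the rollover bug
print(days_between("991231", "000101"))  # -36523 days, not 1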
A lot of modern software development practices grew out of discovering what a fucking mess can grow if people accept an “if it ain’t broke, don’t touch it” mentality.
Very much so. But the vulnerabilities do not tend to be discovered (by developers) until an attack happens. And auto-updates are generally how the spread of attacks is limited.
Open source can help slightly, due to both good and bad actors unrelated to development seeing the code. So it is more common for alerts to land before attacks. But it is far from a fix-all.
But generally, the time between discovery and fix is a worry for big corps. That is why auto-updates have been accepted with less manual intervention than was common in the past.
Not OP. But that is how it used to be done. The issue is that the attacks we have seen over the years, i.e. ransomware attacks etc., have made corps feel they need to fix and update instantly to avoid attacks. So they depend on the corp they pay for the software to test the rollout.
Auto-update is a two-edged sword. Without it, attackers etc. will take advantage of delays. With it, well, look at today.
If they just called it "Other".
It would gain a huge boost in desktop usage figures.
Thanks. That was exactly what I needed. I’ll look it up.
Seems more like it is creating a successor.
A valid point. But the result is that, over a pretty short period of time, these C developers will find delays in how quickly their code gets accepted into stable branches etc. So they will be forced to write clear documentation on how their refactoring affects other elements calling the code, or move on altogether.
Sort of advantageous to all, and a necessary way to proceed when others are using your code.