
Why Reddit Sucks


I try to stay off social media because social media is human poison. From time to time I will go to Reddit for technical discussions. You would think that, given Reddit's income, they would hire actual developers and have a physical QA test team, wouldn't you? Guess not. In case you cannot see the featured image, here it is:

[Image: Reddit error message]

I have posted on Reddit before. I spent three hours creating a very detailed response to this question. I tried multiple times from multiple browsers. I took all of the links out. I constantly received the same useless error message without any indication of why it was happening.

The Reddit Response

Reddit users won’t see it, but you will.

As someone with roughly 40 (possibly more) years in IT, having written software for everything from real computers with real operating systems before we had the Internet, to desktops running DOS, OS/2, Windows, and Linux, and having spent the past decade creating embedded systems for mostly medical devices with my own Yocto builds of Linux for the target, I will try to answer your query.


Linux was birthed in Anarchy.

Prior to TCP/IP being anywhere close to a network standard, the real computers with real operating systems had proprietary everything. Every company wanted to be the "industry standard" which all other vendors had to license. You had to buy everything and it was expensive. The flip side of that was you had exactly one tie to grab when things didn't work. Life was expensive, but it was stable. DEC ported VMS from the 32-bit VAX (1980s) to the 64-bit Alpha (1990s) and, other than having to VEST a few binaries/object libraries nobody has source for anymore, customers noticed almost nothing. It was much the same when moving to the failed Itanium processor.

Those who used IBM had a similar journey. Even the "new" z/OS is mostly a new version of MVS.
When you are charging millions of dollars for systems and customers are using them to process billions of dollars in transactions each year, stability is your primary DNA component. You use actual real Software Engineering, not Agile. You have an actual QA testing department, not TDD. You push nothing out the door until you’ve both stress tested it and tried it in the DEV systems for some major clients.

“Free” weekly trade magazines combined with oceans of marketing fraud tried to make a case for the “little” computers. They started touting “Open” things even though almost every version of Unix was commercial at the time. They even called Windows an Open operating system even though it was (and still is) 100% proprietary.

The first big battle line during the days of SUN Microsystems was RISC (Reduced Instruction Set Computers). The Mainframe and Midrange computers all used proprietary CISC (Complex Instruction Set Computers). The "free" weekly trade magazines and marketing fraud got MBAs to focus on MHz and clock cycles instead of overall throughput. Yes, some of those CISC instructions took 3-8 clock cycles to complete, but it took 40 RISC instructions to do the same thing. That last part got lost in the publications.

Keep in mind UNIX, in all its names, was proprietary. Bell Labs released the code as part of the AT&T breakup, but everyone sold a commercial flavor of it. TCP/IP was not yet a networking standard. Text files were, and still are, a joke. IBM has EBCDIC and I forget what for line endings. Typically DEC had <CR><LF>, but it could be almost anything because FORTRAN, forget which version, used just <CR>. Windows and a few other vendors used <CR><LF>, UNIX flavors used just <LF>. Adding insult to injury, DEC and IBM platforms understood RECORDS. The little computers, to this day, have no concept of them at the OS level. This makes true clustering impossible on the little computers. You can only have fake clustering.
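To make the text-file complaint concrete, here is a quick Python sketch (my own illustration, not anything from the original post) showing the same two lines under the three line-ending conventions common on the little computers. Record-oriented files on DEC and IBM systems did not need an in-band terminator character at all; the record format lived in the file's attributes, which is part of what the little computers never picked up.

```python
# A minimal sketch, purely my own illustration: the same two lines of text
# under three different line-ending conventions.
unix_text    = b"LINE ONE\nLINE TWO\n"      # UNIX/Linux: bare LF
windows_text = b"LINE ONE\r\nLINE TWO\r\n"  # Windows and a few others: CR+LF
old_mac_text = b"LINE ONE\rLINE TWO\r"      # classic Mac and some others: bare CR

for label, blob in (("unix", unix_text), ("windows", windows_text), ("mac", old_mac_text)):
    # splitlines() copes with all three; naive code that splits on "\n"
    # leaves stray carriage returns behind on CR+LF files and sees the
    # CR-only file as a single line.
    print(label, blob.splitlines(), "vs naive:", blob.split(b"\n"))
```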


Enter the desktop.


MS took all IBM's money "working on" OS/2 to create Windows. Prior to Windows NT, Windows was never an operating system, despite the fraudulent marketing. It was a task-switching GUI layered on top of DOS. Presentation Manager was the task-switching GUI layered on top of OS/2. IBM had the ethics to not call Presentation Manager an operating system; Microsoft did not have the same level of ethics with Windows.
Microsoft tried to come up with their own desktop navigation "standard." IBM published CUA (Common User Access) and the world followed. CUA only seemed foreign to people who came from the world of the little computers. It was exactly how CICS applications worked on 3270 terminals. They just added some mouse functions.

The "free" weekly trade rags got more advertising dollars from Microsoft and the little computer vendors than they did from DEC or IBM, so they emptied their colons on OS/2. Microsoft pulled out of the joint venture. IBM killed off OS/2 and the world of software on the little computers was set back at least 30 years. OS/2 had waaaaay better memory management than Windows. It was dramatically more stable and it was developed with real computer mentality. Take a look at the screenshot of IBM Lotus SmartSuite.

It existed for both OS/2 and Windows. Everything was integrated. You could share documents between operating systems without them getting trashed. It had a user-friendly, easy-to-understand desk-drawer and file-cabinet UI. You kids today may not be impressed, but that desktop design was groundbreaking when it shipped.

The Linux anarchy started around this time. You could download it via dial-up from a BBS or buy 2 floppies from someone. It was command line only. There were so many people who couldn't get along with Linus that hundreds of "flavors" were created. BSD isn't Linux; if memory serves, it is the last refuge of AT&T UNIX.

Trench warfare was rampant. The commercial UNIX products were dumped on by professionals. Corporations forced DEC and IBM to come out with their own proprietary Unix flavors. I think DEC had two, Ultrix and something else. I forget what IBM’s was. Until DEC came out with the VAXStation and DECWindows (based on Motif) the world of real computers didn’t have graphics. That was for the little computers.

Keep in mind, at this time most disk drives were measured in MB. Eventually I had a 1GB drive in a VAXStation, but it was insanely priced. I seem to remember it was around a grand. Yes, having pictures of parts instead of just numbers and text would have been nice, but we couldn’t store it.

To help grease the skis for sales, most Unix flavors claimed to provide a DEC VT-100 terminal software package. Even today your Linux flavors claim their terminal software operates in VT-100 mode. It doesn't. Get yourself a free account on Eisner, log in, and try to use EDT keypad navigation. If you want to try out a Linux editor with EDT keypad navigation, pull down an AppImage of RedDiamond.

Throwing up a big fat middle finger to the DEC world, while claiming VT-100 compatibility, the Unix/Linux world deliberately hid the NumLock key in the standard terminal definition. One has to hack the terminal definition to make it usable when in VT-100 mode connected to the operating system designed to use VT-100 terminals from its birth.
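If you want to see what your terminal definition actually advertises, a quick sketch (mine, not anything from the original post) is to query the terminfo entry; the keypad-related capabilities are where the behaviour being complained about lives.

```python
# A minimal sketch (my own illustration): peek at what the current terminal
# definition claims about the application keypad, which is the machinery
# EDT-style keypad navigation depends on.
import curses

curses.setupterm()  # loads the terminfo entry named by $TERM

for cap in ("smkx", "rmkx", "ka1", "kb2", "kc3"):
    # smkx/rmkx switch the keypad into and out of "application" mode;
    # ka1, kb2, kc3 are a few of the keypad key capabilities.
    print(cap, "=", curses.tigetstr(cap))
```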

Once Qt came out, a group of egos got together and created the K desktop. You know it as KDE. It was an attempt by the flaming egos of the Linux world to create something that almost looked professional. Windows and OS/2, heck even DECWindows users, had an OS desktop with a central clipboard. They could run multiple applications interactively at the same time and cut & paste between them. Linux could do none of that. Oh, there was some Motif stuff and quite a few graphical applications, but not an OS desktop.

KDE was buggy. It quickly became bloated. The database used for the PIM used to fall over more than a toddler learning to walk. But it was something. The SuSE world created YaST, which was universally hated by all except devout SuSE believers. The email client was and still is ugly. Most of the distros claiming to provide a KDE desktop don't actually install most of the KDE applications. Abandonware is rampant in the KDE community.

You need to know that much of the history to get to the heart of the matter.

Neither science nor human language has come up with a proper scale or tool to measure developer ego. Stories of Linus Torvalds' unwashed rectal sphincter personality are legendary, but only a complete asshole could have made Linux a success. Without such a titanium grip, backed up by a flamethrower and heavy artillery, the Linux kernel wouldn't be any better than the Linux desktops. If you want to see what the bottom looks like, try Canonical's Unity desktop, the most hated desktop ever.

IBM had a corporate structure. They had "Efficiency Experts," later called Systems Analysts; now some are called Usability Experts and UX designers. IBM used actual Software Engineering for OS/2. They created an architecture for the product. They defined how memory would be managed and how processes would communicate, and possibly created the first successful desktop clipboard. Presentation Manager had to be painted onto the screen following a rigid set of rules. IBM designed the desktop for business users because that's who paid for things.

Microsoft was and still is a really poor software company. They improved dramatically after stealing design knowledge from IBM while also stealing their money. Windows 3.1 and Windows for Workgroups were the best Microsoft could do given their inability to develop quality software. DOS had a 640K memory limit, don't forget. Windows for Workgroups was the first time Microsoft made any valid attempt at understanding the "business user." They hired Dave Cutler, one of the engineers behind the creation of DEC VMS.

Cutler used actual Software Engineering, not Agile. He created an architecture for how memory was to be managed and how applications could communicate with each other as well as with system services. The version he developed for the DEC Alpha was oceans better than the version Microsoft shipped for x86. It really was a GUI VMS. That's why it was WNT: one letter up from each letter of VMS.
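The letter shift is easy to check; a throwaway Python one-liner (my own illustration, not the author's) does it:

```python
# Bump each letter of "VMS" up by one position in the alphabet.
print("".join(chr(ord(c) + 1) for c in "VMS"))  # prints WNT
```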

Both IBM and Microsoft had corporate structures. The developers were employees or contractors. Upper management controlled what the product looked like, what it did and didn't do, and what would ship, and they controlled the release. IBM ran full QA on their PC software just like their mainframe software. I think we can all agree Microsoft does significantly less in that area. If your developer ego couldn't take not being "in charge," not doing it your way, or management not including your feature, today was your last day; grab your shit and go home.

When you expect people to pay for something, you can't ship garbage.

Contrast that with the Linux world.


In the world of real computers the OS provided a robust accounting system. Every byte of RAM, every block of disk, and every CPU clock cycle used by any process/user/group/batch job/etc. was kept track of. When your running task ended, either successfully or via crash, the OS swept up behind you. Everything was freed. Not so much here. If some little piece of the desktop crashes, it could have allocated some scarce resource that the now non-existent process will own until you reboot. The real computers were charging millions of dollars and their customers were going to "lease time" to other entities to help pay for the things. Linux was designed with a "just make it work with this single core computer" mindset. Understandable for a personal project, but now you can't retrofit a bulletproof accounting system without gutting a big chunk of the known universe.
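To make the point concrete, here is a minimal sketch (my own illustration, with a hypothetical lock path, not tied to any particular desktop component) using the classic stale lock file as a stand-in for the kind of resource a crashed piece of the desktop can leave behind.

```python
# A minimal sketch (my illustration) of the general problem. The kernel
# frees RAM and file descriptors when a process dies, but it has no idea
# this file on disk was a lock; die before the cleanup line runs and every
# later start-up believes a non-existent process still owns the resource.
import os
import sys

LOCK_PATH = "/tmp/mydaemon.lock"  # hypothetical path, for illustration only

if os.path.exists(LOCK_PATH):
    sys.exit(f"{LOCK_PATH} exists -- a previous (possibly dead) instance owns the resource")

with open(LOCK_PATH, "w") as handle:
    handle.write(str(os.getpid()))

try:
    pass  # ... the real work would happen here ...
finally:
    os.remove(LOCK_PATH)  # never reached if the process is killed hard
```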

Rather than a rigidly defined architecture for how things are supposed to communicate and work, we have a bushel basket: D-Bus, pipes, shared memory, sockets, etc. Well . . . you hope. D-Bus doesn't exist everywhere and some distros shut down other methods in the name of security.
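As a developer you end up writing defensive plumbing like the sketch below (my own illustration, with hypothetical names and a hypothetical socket path): probe for a session bus, and have a fallback ready when it is not there.

```python
# A minimal sketch (mine, not any real project's code): prefer the session
# D-Bus if one is advertised, otherwise fall back to a UNIX-domain socket
# the application manages itself.
import os
import socket

def pick_ipc_channel():
    # A session bus advertises itself through this environment variable.
    if os.environ.get("DBUS_SESSION_BUS_ADDRESS"):
        return "dbus"
    return "unix-socket"

def open_fallback_socket(path="/tmp/myapp.sock"):  # hypothetical path
    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    if os.path.exists(path):
        os.remove(path)  # clear any stale socket file left by a crash
    server.bind(path)
    server.listen(1)
    return server

print("IPC mechanism chosen:", pick_ipc_channel())
```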

Most of the real computers don’t do GUI anymore. Having said that, when you are developing for Windows or possibly even Apple you have the one rigid display API/engine to deal with. On Linux it might be X11, Arcan, Mir, TinyX or . . .


https://news.ycombinator.com/item?id=37899272
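To show how little an application can actually rely on, here is a small sketch (mine, not from the post) of the usual guessing game: check a couple of well-known session environment variables and hope the answer matches whatever is really rendering your pixels.

```python
# A minimal sketch (my illustration): about the most portable thing an
# application can do is sniff the session environment.
import os

def guess_display_backend():
    session = os.environ.get("XDG_SESSION_TYPE", "").lower()
    if session in ("x11", "wayland"):
        return session
    if os.environ.get("WAYLAND_DISPLAY"):
        return "wayland"
    if os.environ.get("DISPLAY"):
        return "x11 (or something else speaking the X protocol)"
    return "unknown -- could be Mir, Arcan, a bare framebuffer, ..."

print(guess_display_backend())
```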


The X11 emulation for Wayland (XWayland), which I dealt with on my last medical device, is a . . . long way from good. Yet another big fat middle finger to standards. What is really honking off the embedded systems world is that they, whoever "they" are, didn't leave X11 in for Yocto builds. We don't use desktops. We don't have the X11 "security issues" because we don't build in most of the stuff that allows those things to happen, and all our tools work with X11. Our tools don't work with the X11 emulation for Wayland.

Many, most, possibly all Linux desktops have had changes and features added to X11 over the years. When X11 gets changed for one desktop, it is rarely tested with the others, and guess what? They are now broken in some fashion.

Corporate entities making decisions about Linux's direction are chasing the gamer market, abandoning the basic desktop and embedded systems markets. X11 has basically a single-threaded server doing the rendering. For embedded systems with a touch screen this is more than enough. For your boring HP Small Form Factor corporate desktop with an i5-gen3 or i7-gen4 (still a ton of them on desks) it's fine. It won't let you utilize the zillion CUDA cores on your NVidia card, and it isn't fine for 4K video streaming, intense gaming, etc. Gamers spend a lot of money. What works well for gamers doesn't work well for a standard desktop where you need to check email, edit a spreadsheet, and write the great American novel.

The most infuriating X11-emulation quirk found on the last medical device project was that you could not control screen position for an application. Something as commonplace as starting an app and having it remember where it was and how big it was could not be achieved.
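Here is a minimal sketch of the complaint (my own illustration, not the project's code). Under a plain X11 window manager a position request like this is normally honored; the core Wayland protocol gives clients no way to place their own top-level windows, so compositors are free to ignore the position part, frequently even for applications coming in through the X11 emulation.

```python
# Ask for a 400x300 window at screen position (100, 100) and report what
# the toolkit says we actually got.
import tkinter as tk

root = tk.Tk()
root.title("position test")
root.geometry("400x300+100+100")  # width x height + x offset + y offset
root.update_idletasks()
print("requested 400x300+100+100, reported:", root.geometry())
root.mainloop()
```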

Any ego kicked out of a project can take the source code, change the branding, and start their own fork. This is as true for entire distros as it is for individual applications. Most developers only care about one or two things, so that is all they test, if they test anything. Too many developers and Linux distros blame NVidia for their problems. My machine was working before you pushed out an update that obviously was never tested with NVidia drivers, because now I have a black screen!

Trouble is, most of those egos don't know how to design anything. Canonical is seriously guilty here. Upstart, their home-grown init system that eventually lost out to systemd, was a massive failure and a bitch to work around for "one Debian to install on them all." You can read more about their failures here.

From a developer perspective, most of the cross-platform toolkits aren't tested with Wayland, Mir, Arcan, TinyX, etc. So, from an application-on-your-desktop perspective, the odds of them working with the lesser-known stuff are small. Fifteen years is an LTS, not five.

One of the reasons Windows succeeded in corporate America is the fact that Microsoft got distracted. XP came out and it was over five years before the catastrophic Vista arrived. Despite support supposedly ending for it, there are still hospitals running Windows XP.

Lastly, Agile is NOT Software Engineering. You don't just push out the result of the last Sprint when using Software Engineering. Customers aren't your Alpha testers. Consider the XP comment. Despite Windows itself being stable as a soap bubble for much of its existence, the XP desktop had an architecture behind it and enough years to get polished. It is impossible to create a solid desktop for an OS that pushes out updates every week or two which only went through TDD automation. Unless you have a room full of people testing on all kinds of hardware, you can't be sure it works.

Not one major corporation I’ve ever worked for allows their desktop/laptop computers to reach out to Microsoft for updates. The support team hosts their own update server. This is why you have a standard corporate desktop with standard apps. The support team has one sacrificial corporate desktop they manually pull updates down to, test with all applications, then move “safe” updates to the corporate update server.

Summary

In the commercial world they use real Software Engineering, not Agile. There is a hierarchy that decides the fundamental architecture of the product. These same people also choose the Look & Feel based on information from research departments and scientists. Yes, scientists. For the non-GUI world of data entry screens we had to keep a typist's fingers over home row. All numeric entry was grouped together so the clerk's hand could remain over the numeric keypad. There were tools to measure the efficiency of your form. These are people who are paid to work on the product, and that product is the source of the revenue which pays them. As such, having a team of highly qualified manual testers verifying the product prior to release is massively important.

In the Linux world most distros are using Agile and Sprints. This does not make for a solid stable product. Updates are automatically pushed out and it is almost physically impossible to turn off automatic updates.

Even if your system was working perfectly, the odds of a black screen or some other issue on reboot are significantly greater than zero. Remember Ubuntu 13? The ISO booted great, installed flawlessly, pulled down all updates, and then on restart, if your network device used Broadcom chips, it couldn't be found. Stories like this are common in the Linux world.

There is no well-defined and solidly implemented application architecture. Admittedly this has gotten better. Most desktops now at least have some GUI tool that will let you assign default applications and keyboard shortcuts. Sadly, most desktops don't fully conform to CUA. If you don't believe that, look up the DELETE-LINE keyboard shortcut for Gnome, KDE, Unity, Mate, Budgie, etc. These variations cause desktop applications that use one or more of the now-conscripted keyboard shortcuts to behave differently on different desktops.
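For the default-application half of that story there is at least a queryable layer; a small sketch (my own, assuming the xdg-utils package is present) shows it. Nothing comparably standard exists for asking what a keyboard shortcut does across desktops, which is the complaint above.

```python
# A minimal sketch (my illustration): ask the xdg-utils layer what would
# open a few common file types on whatever desktop happens to be running.
import shutil
import subprocess

if shutil.which("xdg-mime"):
    for mime in ("text/plain", "application/pdf", "image/png"):
        handler = subprocess.run(
            ["xdg-mime", "query", "default", mime],
            capture_output=True, text=True,
        ).stdout.strip()
        print(f"{mime:20} -> {handler or '(nothing registered)'}")
else:
    print("xdg-utils is not installed; there is no portable way to even ask")
```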

Given the differences in the underlying window managers, rendering is always just a little off from one desktop to the next.

Probably the most significant reason, though, is that most developers aren't getting paid to work on Linux. Yes, Canonical, Red Hat, and a few others hire developers to work on their COMMERCIAL versions of Linux, but much of the development happens in the free-for-the-general-public realm, and if you don't like my feature, go (^(&^(*&^

Roland Hughes started his IT career in the early 1980s. He quickly became a consultant and president of Logikal Solutions, a software consulting firm specializing in OpenVMS application and C++/Qt touchscreen/embedded Linux development. Early in his career he became involved in what is now called cross-platform development. Given the dearth of useful books on the subject he ventured into the world of professional authorship in 1995, writing the first of the "Zinc It!" book series for John Gordon Burke Publisher, Inc.

A decade later he released a massive (nearly 800 pages) tome "The Minimum You Need to Know to Be an OpenVMS Application Developer" which tried to encapsulate the essential skills gained over what was nearly a 20 year career at that point. From there "The Minimum You Need to Know" book series was born.

Three years later he wrote his first novel "Infinite Exposure" which got much notice from people involved in the banking and financial security worlds. Some of the attacks predicted in that book have since come to pass. While it was not originally intended to be a trilogy, it became the first book of "The Earth That Was" trilogy:
Infinite Exposure
Lesedi - The Greatest Lie Ever Told
John Smith - Last Known Survivor of the Microsoft Wars

When he is not consulting, Roland Hughes posts about technology and sometimes politics on his blog. He also has regularly scheduled Sunday posts appearing on the Interesting Authors blog.