The Closed Question

I get a lot of shit for saying that Stack Overflow is the land of 12-year-old boys, but it is the gospel truth. I don’t care how old they really are; they haven’t made it past the age of 12 mentally. Here is another shining example of why I make the statement. I stumbled across this while searching for all of the latest C++/C keywords so I could update a set of regular expressions in the Diamond text editor.

Like all meaningful questions on SO, this question was closed and will probably remain closed even after my edits. It should not shock you that the tiny handful of questions I’ve asked on SO have all been closed as well. If your question can’t be answered with a four second Google search, they will close it. After 3-4 meaningful questions you will be banned from asking any more questions on SO.

Yeah, the children that run the place banned me from asking questions a while ago. One of the first times they shut me down was for this meaningful question. Want to know the hilarious part?

Current SO ranking

Yep, the question deemed unfit by the 12-year-old boys, the one that got me shut down, has earned 1K views so far and some kind of badge. It still only got one up vote. Lots of people need to know how to do it. Most of them work on high-minded, obscure things, so they never reach the 100 reputation (or whatever it is) required for their votes to matter. Given the number of people writing code, only a tiny, tiny, tiny fraction is creating seed files for Ubuntu-based distros.

So, since I’m sure this old, meaningful question asked by someone else won’t be re-opened, and my edit to the question won’t be allowed because it is over the heads of most SO regulars, I will repost the information here.

The Question

This question should have never been closed. This is a standard interview question at legacy/heritage/very old shops. It isn’t Windows specific and it plagues the Unix/Linux world today. LLVM is in the process of adding support for the Pascal calling convention so it can make proper tail calls (among other system service calls).

http://nondot.org/sabre/LLVMNotes/CustomCallingConventions.txt

No. The C calling convention isn’t the only calling convention in the Unix/Linux world. Lots of little systemy things got written in people’s favorite language, and for many that wasn’t C.

C and Pascal evolved in their own little vacuums early on. Dennis Ritchie was squirreled away at Bell Labs creating C, and with it Unix, on a DEC PDP computer. Niklaus Wirth was off in the realm of academia creating a language to teach kids about data structures.

Each language, for its own reasons, created the calling convention it uses. FORTRAN also has its own calling convention and its own memory organization for arrays (column-major rather than row-major).

Read this short article slowly and several times:

This simple program is something you will see everywhere in shops with very old code bases. This is K&R era C code. Actually a touch later, because true K&R would look something like this:

int foo(s, f, b)
  char* s;
  float f;
  struct Baz * b;
{
  return 5;
}

This question is a standard because it is designed to identify a veteran developer and separate them from someone who only knows the latest standard.

C is a stack-based language that doesn’t allow a developer to manipulate the program stack directly. As such, you need to know that it relies on postfix (also called Reverse Polish) notation. The classic example is the parsing of

3 + 4

being pushed onto the stack as

3
4
+

(using a stack that grows down visually. Yes, I know most grow up.)

You do this so you can pop them off in the order needed. First param, second param, operation.

It has been roughly 30 years since I dabbled in Pascal. Really haven’t touched it since the days of DOS and PCs that only had 2 floppy drives.

Pascal has a quasi-weird base. Many implementations target a virtual stack machine or operate in a purely heap-based mode.

Heap-based languages tend to use prefix, or Polish, notation (two names, same thing). The classic examples of this use math expressions and seriously confuse the reality.

In heap-based languages, or those following this convention, parameters go on front to back, the mirror image of C. The classic x86 pascal calling convention pushes parameters left to right and leaves stack cleanup to the callee, while C’s cdecl pushes them right to left and makes the caller clean up, which is what allows variadic functions like printf() to work at all.

There are many complaints in the comments about “undefined behavior.” For someone who knows only the latest, most modern standard, that would be a true statement. Someone who worked with the early compilers and took assembly programming would know that many of the better compilers of the day passed at least one extra value with every function call, no matter what calling convention was being used: the parameter count, always the first value popped off the (virtual or physical) stack.

You never got “undefined behavior” with mismatched parameter counts, you just got uninitialized values.

Why do you see zeros in the output? This was compiled in DEBUG mode.

When you compiled with DEBUG, the vast majority of compilers on the majority of platforms (MVS, VMS, Wang, DOS, Tandem, OS/2, etc.) would initialize all variables. Many would add boundary-checking code for arrays and dynamically allocated chunks of memory. This led to the lovely situation of programs working perfectly when compiled with DEBUG and failing spectacularly when compiled without it.

The intent of the question is to identify an extremely seasoned developer: someone familiar with late 1980s to mid 1990s C and the compilers that went with it. These shops have an extremely large code base written during that time frame. Moving to a new standard would be prohibitively expensive, if it were even possible. Some things got dropped along the way. Some “tricks,” like swapping two variables without using a temporary, may have seemed cute at the time, but won’t port to newer optimizing compilers because they were never a good idea. Anyone who knows the differences in said calling conventions and why they exist will be familiar with all of the problems and shortcuts that were taken during that era.

If you want a shining example of something that mostly got dropped along the way, you need only look at how a single value was passed to a function requiring two and things all worked out.

The Pascal tag should also not have been removed from this question. It is a very good example of calling a Pascal library function from C on any platform and the issues that arise from it.

Which programming language offers the best job market?

I got sucked into a discussion on Reddit with this very question. No, I don’t hang out on Reddit. I wouldn’t even go there but a couple of technical writing crawlers periodically drop an email with links to things there. So, let’s get a few things out of the way up front:

  1. There is no “best language to learn” when the question is asked in a vacuum.
  2. There are many different job markets and industries which use the same programming languages, but they don’t advertise on Indeed or Dice or the general mainstream job boards.
  3. Most advertised = lowest wage.

Before you think about learning a language and beginning a career as a programmer, you really need to take a course on programming logic. Many schools either don’t offer it or completely ruin their students by teaching PASCAL in the class. The students end up learning PASCAL instead of programming logic. I understand. Kids today don’t want to flowchart and write pseudocode, but that is where you have to start if you are going to be any good. I wrote a book on logic years ago. Once you spend a full semester or year solving larger and larger real world problems by just drawing out the logic, you can pick up any 3GL in a few days. Honestly. You already know how to solve the problem and only need to figure out the syntax of the current language. You can jump from COBOL to BASIC to FORTRAN to C to DIBOL with relative ease because the logic and stepwise refinement are the same no matter the syntax.

Yes, logic books are thin, but you cannot skim them. You really need to sit with a small group of people drawing and writing the solutions for the exercises because a small group of people will generate different viewpoints and should ultimately provide a better solution.

“A programming job” is also a bad phrase people toss about without putting much thought into it. Ask instead “What type of life do I want?”

Do you want programming to provide you with most evenings and weekends off?

Well, you aren’t going to have that at a startup or most Web companies. That will only come from a major corporation that happens to use programs to do its real work. From there, poke around to find major corporations claiming to offer that type of life for their developers. If they have Intern programs (most of the big ones do), you can ask the Intern contact what types of languages and skills they look for in programmers and proceed accordingly. Best if you find at least 3 which have some overlapping skill requirements. While they may have some trendy Web skills needs, most of it will be COBOL and C++ because they will have large ERP and WMS packages from vendors like Oracle and SAP. You won’t be “programming in a language” as much as “learning a package written in a language.”

Be warned that most of these companies will try to start you off somewhere between $30-60K. They have been bringing in tons of H-1B workers and paying them nothing. They will also try to place an arbitrary earnings cap on you well south of $100K unless you get a worthless MBA. Time off for friends and family has to be purchased with earnings potential.

Do you want to work 7 days per week, 14-20 hours per day, for years on end, in some vain hope of becoming rich?

You are looking for a startup. There are several Web sites out there which focus on listing jobs exclusively with startups. I forget their names as I avoid them. Initial pay will be low because they will dangle free soda, lunches, and stock options in front of you. Keep in mind the stock is worthless if the company does not go public, and most startups fail. You won’t find the skills they want searching Indeed or Dice or any of the major job boards. You have to do a Web search for “startup jobs” and wind your way through a few sites. When you are 20-something, startups can be cool. There is always the dream of being the next “it” thing. When you hit 30-something, it’s time to take a serious look around. If you are still driving the same car you had in college or cannot afford a reliable car and insurance, you need to move into the real world. You no longer have any youth left to misspend.

Are you a closet hardware geek who really loves playing with your Raspberry Pi or Beagle Board?

Those products are great, cheap learning devices for people wishing to write device drivers and assemble BSPs (Board Support Packages). For them you will be learning C, Bash shell scripting, and a touch of Python. Some firmware is still written in assembly language, but not drivers. This field has two major forks: bare-metal coders and driver coders. Bare-metal coders have no OS. They are writing things like U-Boot and what we used to call BIOS but is now called UEFI. If you are not working with assembly, you will be working with an incredibly stripped-down version of C.

Do you think it would be cool to be part of a team which develops medical devices? I wrote roughly half of the user interface code for this device. That code was C++ and Qt and we ran on a highly customized Embedded Linux. You will also need to read up and learn about remote debugging. Most major Linux distros have a version of the Qt development tools in their repos. You can start by learning C++ and Qt on your desktop/laptop then learn about cross compiling for a Raspberry Pi or Beagle Board.

Despite all of the Web programming Google is famous for, they have a lot of low level OS coders. They are currently abandoning Android and working on Fuchsia.

Canonical is also in the process of abandoning Linux for its own flavor of Fuchsia. Don’t worry. What was Ubuntu will be rebranded as Windows and shipped by Microsoft. It’s already started. When you open up a bash shell on Windows 10, it is running a flavor of Ubuntu.

Most universities can only teach you the fundamentals of software development and some of the languages needed by the largest corporations. A good many don’t even teach COBOL anymore, yet it is still the language with the largest production code base on the planet and will be, pretty much forever. The Silicon Valley crowd creates and discards a new “language” every few weeks. You cannot go to school to learn what they want to hire for now, and you cannot take more than 2 days to learn it because it will be an obsolete skill in 3 weeks.

Know this: when you search on Indeed or Dice or some other general IT job site, the languages you see the most are also the lowest paying. There has been a tidal wave of H-1B workers arriving in America. Most are Java or .Net. Despite the much-talked-about $60,000 threshold during the election, the bulk are paid around 1/3 of that. Yes, you can “always be employed,” but you will start off around $30K/year, and 10 years later, when your salary approaches $80K, that will be as much as you ever earn. If you are okay with never being able to own a home in a nice neighborhood, then go for it.

Yes, you have all seen the salary surveys showing Google and others hiring programmers for around $150K right out of school, but you cannot live on that out there. Google has had to start providing housing. Other Silicon Valley firms are also buying/building living quarters for employees because you cannot get your own place anywhere near the campus for what they pay.

Speaking as someone who has written multiple books on Java, I can honestly say Java is a dying language. It never lived up to its promise and architecturally it can never be made secure. In a world where identity theft and massive data breaches pepper the daily/weekly news, that’s a big issue. No, I’m not going into a lengthy discussion about it with people who cannot yet program. You can go searching for how Java was praised in the early days because of the way it uses URLs to locate classes a program needs anywhere on the Internet. Then sit and think about that phrase.

Yes, some companies are trapped with Java and refuse to bite the bullet. They will continue using Java until they go out of business. This trapped number isn’t anywhere near the size you wish to believe. This trend started in 2010.

Now we have to talk about the definition of “best.”

If your definition of “best” is that there will always be a job and you don’t care much about money, it is IBM mainframe COBOL with CICS and JCL. Yes, most of the gigs will only pay $40/hr, but payroll and accounting systems running COBOL will continue to run long after humans cease to exist. You can also learn Java and make even less, competing with $15/day off-shore labor.

If your definition of “best” is the highest billing rate ever, then you are looking for niche tools and markets. C++ Qt on Embedded Linux or Drupal. Boring and difficult device driver coding in C for various embedded targets. BSP (Board Support Package) work. Whatever tool some startup with massive venture capital is pushing into the market this week.

Whatever is hot as a Web programming language today will be forgotten next week. That’s the way the Web is. Scripted languages come and go. Two years ago Ruby on Rails consultants were billing out at over $200/hr; most are unemployable now. Scripted languages have no staying power in the market because there is no commitment to them. You don’t develop a high-volume core business system in a scripted language because the processor cost of the interpretation is simply too high.

Think about it. ADP is one of the biggest payroll processors in America. Do you really think they could chew through that many payroll records each month using Java or some other scripted language? Keep in mind all of the precision problems with floating point types. Packed Decimal and BCD (Binary Coded Decimal) aren’t just legacy data types to save storage space. They also existed to solve precision problems which crop up with payroll, mortgage and other financial calculations. When you use IEEE floating point types:

0.1 + 0.2 != 0.3

You can search the Internet for all kinds of war stories from people trying to do exact decimal math with floating point data types. This is a problem made worse by JavaScript and other newer scripting languages which have a single numeric data type: double precision floating point. You will find more than one young programmer posting pleas for help because their 0.3 - 0.2 did not equal 0.1.