Wikipedia:Reference desk/Archives/Computing/2016 August 30
Welcome to the Wikipedia Computing Reference Desk Archives. The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
August 30
Are people still training to become Fortran or COBOL programmers?
Or, are they legacy programmers who trained when these legacy technologies were mainstream? --Llaanngg (talk) 17:12, 30 August 2016 (UTC)
- (Anecdotal disclaimer)... but I can cite sources for my stories!
- Indeed, there are still academic and industrial training programs for both Fortran and COBOL.
- I studied FORTRAN-77, FORTRAN-90 and FORTRAN-95, and RATFOR, while I was a student. I know of several industries where these specific programming-language skills are still desired. Here is a website of one major research lab at one major American university where this software is still part of the formal training program: a software tour of SEPlib, which is still well-regarded by certain industrial sponsors.
- I have a friend who has studied COBOL informally as part of on-the-job training (in 2016!) at a major American financial institution. They work with IBM mainframes, and those still exist and are still part of the new-hire career track in certain specialized business units. IBM still advertises COBOL on z/OS.
- All this being said: if you had to decide what to specialize in, you would probably broaden your horizons most by learning Java, C, and Python. But if you are a serious student of computer science, you should learn a few dozen languages, and develop the specific skill-set for learning how to learn computer languages. Most programmers, at some point in their career, will have to work with some unfamiliar language, which may be a domain-specific language, a proprietary software system, or an esoteric or antique project that needs maintenance.
- If you haven't already read Teach Yourself Programming in Ten Years by Peter Norvig, ... go read it!
- Nimur (talk) 18:24, 30 August 2016 (UTC)
- "learn a few dozen languages"? Does it mean 24, 36, 48? That looks like an overkill, even for people who are really serious about computer science. Learning 4 languages in 4 main paradigms, maybe add a 5th really exotic language to them and aim for the depth - that seems like a more reasonable approach. Hofhof (talk) 22:41, 30 August 2016 (UTC)
- I was not exaggerating.
- Nimur (talk) 23:01, 30 August 2016 (UTC)
- Once you learn C/C++, you are functionally literate in dozens of languages. All I did was flip through a reference book to learn Java. I learned PHP by looking at code someone else wrote. I forced myself to learn Lisp a long time ago, so I know the extensions, such as ML. I do a lot of command-line administration, so I regularly use awk, sed, and perl. The military had me using FORTRAN and Ada. With that background, I see new languages and really need nothing more than a reference guide. In my opinion, it all comes down to learning C first. Learn C and you learn dozens of languages. Then, learn Lisp and you learn a dozen more. 209.149.113.4 (talk) 11:45, 31 August 2016 (UTC)
- Indeed, that summarizes my thoughts pretty concisely.
- I think the distinction is, some people program to draw a paycheck, and that's fine... they should develop proficiency and excellence at the most in-demand marketable language. This kind of person usually "maxes out" at one or two programming languages.
- But some people program computers because they are inspired - they want to speak the binary language of moisture vaporators or they have a really solid affinity for working with and thinking about data. Those people will learn sed, and awk, and perl, and lisp, and they'll dump binaries to decode the machine language by hand, and ...
- Last week I had the (mis)fortune of hand-decoding a bitstream recording of PCIe link-layer and transaction-layer packets. With my trusty copy of the PCIe specification in hand, I had to write a program to turn hexadecimal numbers into information so that I could diagnose a hardware or software problem. I discovered that the PCIe transaction protocol is Turing complete - to experienced engineers, that's not actually very surprising, is it? For the novice reader: this means that the data link between, say, your hard drive and your main computer is controlled by a full-fledged, fully programmable computer language. This doesn't only mean that we can reconfigure the machine for data reads and writes: it means that we can use your PCIe link to (very inefficiently) play Pong, to execute the artificial-intelligence simulator called Siri, or to run a cryptographically secure random-number generator (and conditionally inject those random numbers into your precious data files!). The interface is a programming language, even though most people would prefer to call it "just a bunch of bits and bytes." The program is the data! You will probably never find a textbook on "hacking the PCIe link layer to make a Turing-complete computer language." You just accidentally learn this kind of nonsense on the job! (A minimal decoding sketch appears after this comment, for the curious.)
- But because I have trained my brain to think like a machine, to grok data the way a machine groks data, it is easier for me to see emergence in places where you might not expect it. It gives me some unique insights into the realistic and practical issues on abstract topics like intelligence, machine-learning, and fundamental computer theory. When you see other programmers who get it, you connect on a level that is a lot deeper than just sharing a common syntax and dialect. Computer scientists think similarly about complexity.
- For the novice readers, here is an example of something I would call a non-trivial software program: Peter Norvig's spelling corrector. It's a toy solution to a real-world problem. It took the author about 12 hours to "solve" - in Python - and you can bet that he did not spend very much time struggling with the Python syntax. Once he "solved" the problem, many of his friends translated it into dozens of languages - including weird ones like R and D. Translating wasn't difficult. Syntax wasn't difficult. Understanding that the solution is a bunch of simple probabilistic calculations was the key to the problem. Observing how different languages represent that calculation (and deal with, say, character-string syntax) is a great learning experience. (A compressed sketch of that corrector also appears after this comment.)
- Nimur (talk) 15:18, 31 August 2016 (UTC)
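For the novice reader, here is a rough illustration (not Nimur's actual tool) of what "turning hexadecimal numbers into information" looks like for a PCIe transaction-layer packet: it really does boil down to slicing bit-fields out of hexadecimal words. The field positions below follow one reading of the PCIe Base Specification's header layout, and the small name table is deliberately incomplete, so treat the specifics as assumptions to check against the spec itself.

```python
# Toy decoder for the first header DWord of a PCIe Transaction Layer Packet.
# Bit positions follow the PCIe Base Spec header layout (Fmt in bits 31:29,
# Type in bits 28:24, Length in bits 9:0); verify against your own copy of
# the spec before trusting the output.

# A tiny, illustrative map of (fmt, type) pairs to mnemonics -- far from complete.
TLP_NAMES = {
    (0b000, 0b00000): "MRd  (memory read, 3DW header)",
    (0b010, 0b00000): "MWr  (memory write, 3DW header)",
    (0b000, 0b01010): "Cpl  (completion without data)",
    (0b010, 0b01010): "CplD (completion with data)",
}

def decode_tlp_dword0(hex_word: str) -> dict:
    """Decode the first header DWord of a TLP given as a hex string like '40000001'."""
    dw = int(hex_word, 16)
    fmt    = (dw >> 29) & 0x7    # bits 31:29 - header length / data present
    ttype  = (dw >> 24) & 0x1F   # bits 28:24 - transaction type
    length = dw & 0x3FF          # bits  9:0  - payload length in DWords
    return {
        "fmt": fmt,
        "type": ttype,
        "length_dw": 1024 if length == 0 else length,  # 0 conventionally means 1024 for memory requests
        "name": TLP_NAMES.get((fmt, ttype), "unknown / not in this toy table"),
    }

if __name__ == "__main__":
    # 0x40000001: fmt=010 (3DW header with data), type=00000, length=1 DWord -> a memory write
    print(decode_tlp_dword0("40000001"))
```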
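And for readers who want to see what "a bunch of simple probabilistic calculations" means in practice, here is a compressed sketch along the lines of Norvig's published Python corrector. It assumes a large plain-text corpus on disk (his article uses a file named big.txt), and the details may differ slightly from his exact code.

```python
import re
from collections import Counter

def words(text):
    "Tokenize `text` into lowercase words."
    return re.findall(r'[a-z]+', text.lower())

# Assumption: a large plain-text corpus sits next to this script; Norvig's
# article uses a file called big.txt, but any sizeable English text will do.
WORDS = Counter(words(open('big.txt').read()))

def P(word, N=sum(WORDS.values())):
    "Estimate the probability of `word` from its corpus frequency."
    return WORDS[word] / N

def known(candidates):
    "Keep only the candidates that actually occur in the corpus."
    return {w for w in candidates if w in WORDS}

def edits1(word):
    "All strings one edit (delete, transpose, replace, insert) away from `word`."
    letters = 'abcdefghijklmnopqrstuvwxyz'
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [L + R[1:] for L, R in splits if R]
    transposes = [L + R[1] + R[0] + R[2:] for L, R in splits if len(R) > 1]
    replaces = [L + c + R[1:] for L, R in splits if R for c in letters]
    inserts = [L + c + R for L, R in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def edits2(word):
    "All strings two edits away from `word`."
    return {e2 for e1 in edits1(word) for e2 in edits1(e1)}

def correction(word):
    "Most probable correction: prefer the smallest edit distance, then frequency."
    candidates = (known([word]) or known(edits1(word)) or
                  known(edits2(word)) or [word])
    return max(candidates, key=P)

print(correction('speling'))   # -> 'spelling', given a reasonable corpus
```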
- "learn a few dozen languages"? Does it mean 24, 36, 48? That looks like an overkill, even for people who are really serious about computer science. Learning 4 languages in 4 main paradigms, maybe add a 5th really exotic language to them and aim for the depth - that seems like a more reasonable approach. Hofhof (talk) 22:41, 30 August 2016 (UTC)
- Did you go so far as to run Linux on the controller? (See Spritesmods for "running Linux on a hard drive".) 209.149.113.4 (talk) 16:19, 31 August 2016 (UTC)
- I like Linux, and I think it's great for a lot of stuff. But around these parts, when I want to get clever and run software in places where I shouldn't, I usually prefer to boot XNU, rather than Linux! But in this case, no, I did not want to try to abuse the link layer so badly - I just found it to be a fun observation! Nimur (talk) 19:27, 31 August 2016 (UTC)
- I regularly take contracts for FORTRAN, COBOL, and ADA jobs. Most are government, but other industries use them. I don't mind that it is old technology. I get to charge more because there is less competition. 209.149.113.4 (talk) 19:09, 30 August 2016 (UTC)
- Yes. That's what I do on a daily basis, and it is something that I learned in the last decade (and these languages have been around for much longer than that). In particular, in the field of high-performance computing, Fortran is still king. Titoxd(?!?) 19:57, 30 August 2016 (UTC)
- Fortran is still widely used in science. Intel, for instance, regularly releases new versions of the Intel Fortran Compiler, which takes advantage of the features of modern processors. Ruslik_Zero 20:40, 30 August 2016 (UTC)
- The entire Department of Defense payroll is calculated on 2 mainframes running COBOL: they can't communicate, so the staff prints out data from one computer, and then types it into the other. Apparently, they've never been audited by Congress, either: http://www.npr.org/2013/07/16/202360167/investigation-reveals-a-military-payroll-rife-with-glitches. OldTimeNESter (talk) 16:28, 31 August 2016 (UTC)
- This is personal experience, but I know that a major military contractor runs its engineering simulations using legacy FORTRAN routines from the 1970s. I've also heard that FORTRAN is still used by some insurance companies, because the code is seen as reliable, and replacing it is both risky and expensive. OldTimeNESter (talk) 16:34, 31 August 2016 (UTC)
- I am inclined to disbelieve the sensational claims that were made in this now-famous news report about the Department of Defense payroll. I suspect that the Department of Defense has no reason to disavow such claims - why should they choose to provide corrected information about sensitive computer systems? It is in their best interest that the misinformation persists. The general public, including all the news editors who work for NPR and Reuters, and all the hackers who seek to cause harm to the infrastructure, has no useful insight into the inner workings or implementations of their systems.
- People who need to know, like our senators and representatives who oversee the budget, and our civil servants who implement the process, almost certainly have privileged access to information that is not made public. Have a look at some search results for 'security clearance' at the website of the Government Accountability Office, or the same at the website of the OPM.
- Our government, and our Department of Defense, both have many inefficiencies; I believe many stories and anecdotes I hear about dinosaur computers in government offices... but I do not believe for even a brief minute that an investigative reporter managed to successfully and accurately discover any meaningful technical details about how "all the payroll" for our military is handled.
- Nimur (talk) 19:39, 31 August 2016 (UTC)
- Without explicitly stating the warrant for my belief, I'm inclined to believe it's even worse than the article states. OldTimeNESter (talk) 19:47, 31 August 2016 (UTC)