Getting Started, circa 1983 (simonallardice.com)
80 points by davidbarker on April 27, 2014 | 35 comments



I think the author meant 1973, not 1983. In 1983 I was about to graduate from USC; we had FINE (Fine Is Not Emacs), vi, TOPS-20, TENEX, TOPS-10, Beehive terminals, VT52 and VT100 terminals, the infamous 'urinal' terminals (the Lear Siegler ADM3), and really cool Heathkit H29 and later Z29 terminals. You tended to work in an 80 x 24 window plus a one-line status area, but that was not nearly as bad as writing things out on sheets, which I did do in 1973, because the only way to program Fortran at the high school was to send it to the district office to run as a batch job.

Nitpicking aside, though, the key issue was turnaround time: the time between when you thought you were done and the time you knew your code worked. Few people these days have experienced hitting enter at the shell prompt and waiting 30-45 seconds before the command even started running. This was one aspect of workstations that made them so popular: no time sharing meant you could compile and test quickly, and simple things like 'ls' on a directory were fast.

It got "too good" though, with people having the computer check things rather than think about what they had written. That leads to a situation where if the result is close enough to what you think it should be, you can convince yourself that it is correct before it actually is, and that can lead you to look elsewhere when things go wrong and makes debugging harder than it should be.


I was so excited when I got my own VT220 on my desk and didn't have to go into the raised-floor computer room. We used a VAX 1170, then a MicroVAX, to compile C for Motorola 68000 and 68020 processors. Then you'd download the code to an EPROM burner.


Well, the author notes that the workplace upgraded to terminals after just a year or two. I know that the high school where my mom taught programming in Chicago still had punch-card technology (no terminals) as of the early 1980s, although the upgrade happened around then. Also, some of the other mentions (e.g., IBM clicky keyboards) are consistent with a 1983 date.


Yeah, in 1983 I had a BBC Model B on my desk; I could type my programs directly into the computer and run and test them straight away.

Mind you, I was only 8 years old, so these were pretty simple programs:

  10 PRINT "codeulike is skill!!!"
  20 GOTO 10


edit-compile-test turnaround time is probably the most important factor in programmer productivity.

Waiting for a whole day to get your COBOL compiler output back after missing a '.' somewhere can get depressing really quickly.


"A year or two into this, we were finally upgraded. Amidst much rejoicing, we were provided a handful of IBM 3270 dumb terminals"

Actually, the 3270 terminal was pretty smart for its time. It was capable of x/y cursor addressing (supporting full-screen editors as opposed to line editors like the original Unix "ed"), had protected/unprotected fields (for form entry, you could only enter data in the unprotected fields and tab from one field to the next, like a form in HTML), and one version, the 3279, even had a color display. You can see a photo of a 3279 here:

https://en.wikipedia.org/wiki/IBM_3279

My first experience with computers in the 1970s was with punched cards, but unlike the author, the output (printed on fan-fold line printer paper) usually came back in a matter of minutes. When your program crashed, you'd get a hexadecimal dump of registers and memory - printed on paper.


Whenever I read about how hard it was to do things in the past — program, edit movies, typeset books, make butter — I'm amazed they even bothered! What tasks that we do today will seem shockingly tedious and hard in a few decades? Well, programming still kind of sucks...


Always great to read these stories.

My dad worked with punch cards to crunch numbers in Fortran as a young statistician. Initially the university had only one big machine. Every day at 5PM there were long lines of people delivering their decks of cards with source code to the computing center. At night clerks fed all the jobs to the machine, and at 9AM there was another line of people getting their job output back.

The output cards were stacked in boxes on the wall by user name. You could tell the height of the output decks from some distance, and if a box contained only one card, you knew instantly it was a syntax error and someone was going to have a bad day. So apparently, right at the start of computing there, they already had a nice analog build-status visualization.


"You want to become a ProgrammiererIn (the female ending)? Forget it, it's a badly paid women's job, and besides that: men can't type!"

My first job was a bit more modern: instead of men writing code with pencil and women typing it in, we had women who coded, did the admin work, and typed it in themselves. The men moved up to become system architects or database designers. We had ASCII typewriters with paper tape to type our own code. The paper tape then went to the print-and-copy room, where it was printed and copied onto magnetic tape. Compile time was scarce, especially during the day shift. So once the printout was on the table, a group of coders, all women except me, went over it with a red pencil to fix bugs before the job was submitted.


In my first professional job, in the late 80s, my first boss/lead did not know how to use a text editor. Punched cards were all he knew, and the main system did not even feature a text editor (we actually compiled programs on one machine, then the compiled output was migrated over). There was, however, a utility that emulated punched-card input (this was on Burroughs hardware). But he could perform a randomizing routine in his head.

Almost all of the old-timers used pencil on coding sheets like the ones displayed in this article and were aghast at us young whippersnappers who entered code directly into the terminal.


I wonder what the next stage of this will be. Will programmers of the future marvel that we even had to use keyboards at some point?


I started programming in 1982 and became adept at writing my code (especially machine language) out by hand, then converting it to hex by hand so I could key it into the monitor (this on a 6502, a VIC-20). Having to do this makes you very careful: you made sure your logic was correct before you converted. You ran the code through your head multiple times to make sure it worked. Even today I rely on reading and rereading the code I write to find logic bugs before I compile and run it. I find that kids coming out of school (yes, now that I'm old enough to be the father of a current college graduate, I'll call you kids) are more likely to code whatever comes to mind and then "run and pray" that it works. If you're lucky, they'll use a debugger if it doesn't work. If you're unlucky, they start putting in print statements.
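For anyone who never hand-assembled code: "converting to hex by hand" meant looking up the opcode byte for each instruction and writing out its operand bytes yourself before keying them into the machine-language monitor. Below is a minimal Python sketch of that lookup-and-write-out step; the load address is made up for illustration, but the opcodes are standard 6502 and $FFD2 is the Commodore KERNAL character-output routine on the VIC-20.

  # A sketch of hand-assembly: each 6502 instruction maps to an opcode byte
  # plus its operand bytes (16-bit operands are written little-endian).
  PROGRAM = [
      ("LDA #$41",  [0xA9, 0x41]),        # load the character code for 'A'
      ("JSR $FFD2", [0x20, 0xD2, 0xFF]),  # call CHROUT to print it
      ("RTS",       [0x60]),              # return to the monitor
  ]

  addr = 0x1000  # hypothetical load address you would pick at the monitor
  for source, hex_bytes in PROGRAM:
      dump = " ".join(f"{b:02X}" for b in hex_bytes)
      print(f"{addr:04X}: {dump:<10} {source}")
      addr += len(hex_bytes)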

I sometimes wonder if this type of programming is a side-effect of the short attention spans teens and twenty-somethings have now... but that's a subject for another time.


> If you're lucky, they'll use a debugger if it doesn't work. If you're unlucky, they start putting in print statements.

I think you got that backwards. printf debugging is a very useful but dying skill. If you have a usable debugger and you're working on one atomic single-threaded module, great -- but if you are doing low-level, asynchronous stuff, often on disparate platforms, the ability to write a log file and debug things that way is a precious resource.


I completely agree: I do a lot of printf debugging in exactly the cases you describe. As with everything, though, it's a tool, and not necessarily the first one to reach for.
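For what it's worth, the log-file style of debugging described above doesn't need much machinery. Here is a minimal Python sketch (the task names, timings, and file name are invented for illustration): timestamped lines from concurrent tasks go into one file, and the interleaving you read back afterwards is the evidence.

  # Sketch of debugging async code by writing a log file instead of
  # single-stepping in a debugger: every event gets a timestamped line.
  import asyncio
  import logging

  logging.basicConfig(
      filename="debug.log",
      level=logging.DEBUG,
      format="%(asctime)s.%(msecs)03d %(message)s",
      datefmt="%H:%M:%S",
  )

  async def worker(name: str, delay: float) -> None:
      logging.debug("%s: started", name)
      await asyncio.sleep(delay)  # stands in for real asynchronous I/O
      logging.debug("%s: finished after %.2fs", name, delay)

  async def main() -> None:
      # Run two tasks concurrently; their log lines interleave in real order.
      await asyncio.gather(worker("fetch", 0.2), worker("parse", 0.1))

  asyncio.run(main())
  # Reading debug.log afterwards shows exactly how the tasks interleaved,
  # which is usually the point when chasing an ordering bug.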


code whatever comes to mind and then "run and pray" that it works.

I've noticed this too, especially having worked with the newer students. Whenever I point out what needs to be changed, before I finish the sentence they make one of the changes (sometimes incorrectly) and then hit "build and run" in their IDE before I can stop them - and then often face a very long list of compilation errors, because I hadn't finished telling them everything that needed to be changed. I've also observed on many occasions someone trying to fix a bug by making many, many tiny changes, seemingly randomly, and attempting to compile and run between each change to "see if it works now", their code turning into a horrible mess in the process. A suitable term for this could be "IDE thrashing", since it appears to come from the "instant gratification" that they provide. Quite unproductive, and usually a sign that whoever is doing it has only a very vague idea of how their code works.

When I started using IDEs, the temptation was certainly quite strong, so I can certainly understand how those who started with one could fall into this trap. Avoiding it takes a lot of discipline, and the realisation that these frequent iterations break concentration and make it more difficult to focus on the problem. Even today most of my work is done with a text editor, a whiteboard, a pencil, and lots of paper. Whenever I face a non-obvious bug, my first reaction is not to "try something that might work", but to think carefully about the code/algorithm again and see if I didn't miss something the first time I designed it, because a bug like that tends to be an indication that something further is wrong.


Out of curiosity, wouldn't print statements go along with the read-and-reread method? I switched from using a debugger to mainly using print statements (and logs, which are functionally similar) because the feedback loop with a debugger is significantly longer. If you have the code in your mind and a "sense" of what could go wrong, seeing the state of the program when it failed is generally sufficient for me.


Reminds me of a similar but more recent anecdote from when I was in school (2005). During our lab exams, the teacher used to give us about three-quarters of the total time to write the solution to the programming question on a piece of paper, and the remaining time to type in what you had written, make corrections, and compile. The programming questions were elementary: string manipulation, printing some pattern of `*`, and so on.

In retrospect, this approach made us think about edge cases that I would now consider much later in the coding process. The overall grade was based on correctness and also on the number of alterations to the logic made in the final running code.

edit: by school, I refer to senior high school as in the US [0]

[0] https://en.wikipedia.org/wiki/Senior_high_school#United_Stat...


I started programming when I was 10 and I was instantly hooked. Two years later, my parents got a divorce and my dad took the computer. I only saw him on weekends, but I had to get my programming fix. During the week, I'd write QBasic programs in a notebook, then type and test them on weekends. Eventually I graduated to assembly and C. It was a great way to pass the time in high school health class.

To this day, I still prefer to do a lot of writing before I start typing, though I don't often write the entire program ahead of time. Instead, I'll write the "interesting" bits -- the stuff that's algorithmically complex or otherwise kind of tricky. I find that with nothing but a pencil and a sheet of paper, there are fewer distractions and fewer limitations (comments can readily contain drawings, for example).


In the early 80s, even though I had access to a terminal, I always wrote my code out with pencil and paper before typing it in. Don't remember why exactly. I wonder if that was just a holdover from an earlier time and how we were taught.


I used to code like that even into the 1990s, when I worked at an investment bank that used BASIC (yes, old-style BASIC with line numbers). We had dumb terminals at our desks, but the editor capabilities were pretty limited, so it was often more productive to write pencil drafts of your code until it was close to correct. Extensive pencil reworking on printed copies of the code was also common.


The furnishings of pretty much any '80s-era programmer I got to observe seemed to include binders full of notes, code listings (and, less often, program output) printed out and marked up, Post-its, and boxes of tapes or diskettes. A pocket calculator to work out some figures. Graph paper for working out pixel graphics, if a better tool wasn't available.

Compared with today's streamlined tools - everything cloud synced instantaneously from portable devices, and maybe a small paper notebook to sketch ideas - it seems incredibly quaint. Lots of artifacts that just aren't necessary now.


My 9 year old son was shocked the other day when I mentioned that when I started programming we didn't have "windows" or a mouse. He said "How can that even possibly be a computer?"


What's interesting is that kids who started programming in the '70s and '80s did use desktop computers. I was one of them; I started programming in 1983 with a TI-99/4A. The computers I used at school had floppy disk drives, so I could even save my work. For the first few months I owned just one 128kb 5 1/4 floppy where I kept my (interpreted) programs.

I used to type in BASIC programs from magazines such as Byte. A typical bug that bit me several times was typing an l instead of a 1.


I owned just one 128kb 5 1/4 floppy

'Why would anyone ever need two?' I remember thinking.


Geez, I used to get hit with that "l" instead of a "1" all the time. It was so much nicer when some of the mags went to a fixed-width font on non-gloss paper for code.


I was lucky that one of the first issues of Compute! I got my hands on had a listing for a proofreader[1], a utility that checksummed each typed line of BASIC and displayed the checksum in the left margin. All listings in the magazine had these checksums printed to the left of the line numbers, so it was easy to catch typing errors.

This was especially useful without any kind of permanent storage. I had to type in my games every time. :-)

[1] Compute! November 1986 issue, page 94. https://ia600700.us.archive.org/6/items/1986-11-compute-maga...
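The idea is simple enough to sketch: compute a small per-line checksum, print it next to each line, and the person typing compares it with the value the proofreader shows on screen. The Python snippet below is only an illustration of that idea, not the actual algorithm the magazine used, and the listing lines are made up.

  # Sketch of a per-line checksum in the spirit of the magazine proofreaders:
  # a one-byte sum of character codes that catches most typing errors.
  def line_checksum(line: str) -> int:
      return sum(ord(ch) for ch in line) % 256

  listing = [
      '10 PRINT "HELLO"',
      '20 GOTO 10',
  ]

  for line in listing:
      # The magazine printed the checksum in the margin next to the line;
      # the typist checks it against what the proofreader displays.
      print(f"{line_checksum(line):3d}  {line}")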


That way (with paper and pencil) is how we still coded at university, in exams and tests, and it was still the same in 2014.

In particular, you have to remember a lot of the Java standard library APIs, and the same goes for C#, ASM, C, and C++.

For one exam I had to memorise the syntax of Lisp, Haskell, Ada, Pascal, Scheme, J#, and Smalltalk, plus their common APIs, in order to write code snippets on paper and pass.

What was your experience with university coding exams?


Being a somewhat "brash young whippersnapper" myself, I was aware of the punch-card scheme, but somehow it had completely slipped by me that as late as the early '80s programmers were still using pen and paper. Somehow, I had tended to imagine that the days when that's what it meant to be a programmer belonged to the same era as Alan Turing and the Enigma machine.

I would guess those programmers made sure they had perfect penmanship; I wouldn't have wanted a typist to mistake my colons for semicolons.

I wonder if the act of physically writing out the program with pen and paper actually made it stick more viscerally in the programmer's head?


> it had completely slipped by me that as late as the early '80s programmers were still using pen and paper. Somehow, I had tended to imagine that the days when that's what it meant to be a programmer belonged to the same era as Alan Turing

seconded.

but then in another 15 or 20 years, people will probably be like, "wow, you needed a second monitor just to read documentation?" or "10 million lines of code was an acceptable size for a single application?" or something similar.

much respect & sorry about trampling the flowers, sirs.


Even when programmers had their own terminals, on a busy mainframe the queue time for compile jobs could easily be many hours. It really paid to spend time "desk checking" your code for typos and logical errors. It was faster than letting the compiler do it.


Very cool. I have a sense of the punch card era, but this is the first anecdotal write-up I've seen of an intermediate period that still relied on paper input. Thanks for sharing!

I eat up computer history, but it's especially nice to hear firsthand accounts about seemingly mundane stuff like workflow.


If you want more, there's always this:

Old-school programming techniques you probably don't miss: 11 skills and tactics that every programmer once needed to master ... and today can blissfully forget

http://www.computerworld.com/s/article/9132061/Old_school_pr...


This must be why so many instructors seem to think that catching syntax errors is such an important skill.


Meanwhile, I program in an environment that highlights my line in red if I leave a syntax error in place for more than three seconds. What a different world.

(And before someone says it, yes, the programming languages have come quite far, too. Drop back to a 1970s general purpose programming language for a week if you don't believe me.)


THINK Pascal was doing this on the Mac in the mid 1980s... TP was an amazing piece of engineering given its functionality and the environment it provided.



