Where I started might not be useful to you (rachelbythebay.com)
152 points by zdw on April 5, 2018 | 100 comments



I frequently run into the same problem. I work with clients, and inevitably one of their employees asks something like "how did you get to be so good at Linux? You figured out a problem we've been chasing for weeks!"

And I hate my answer, especially when the employee is around my age or older: I first installed Slackware from floppy disks when I was about 12 years old. I installed it because I heard that Linux had a great C compiler, and I was getting tired of the Power-C compiler that I'd been using for 2 years.

Like Rachel, I started on a Vic-20, around age 8. The computer was at my dad's house though, and I lived with my mom most of the week, so I would take out books from the library, write all of my code on loose leaf, and then fast & furiously type it in when I got over to dad's house to see if it worked. POKE 36879 indeed.


I sometimes wonder if growing up before the internet and Wikipedia and Stack Overflow was actually a boon for my problem solving skills. I spent so many hours with nothing to go on but some books and my own persistent trial-and-error. If I'd done the same stuff these days I don't think I'd have learned half as much (although would I have done more, different things? Probably...)


I think there are two really important points here, and they're only loosely connected to each other.

1. Youthful trial and error is really powerful. I had access to a 286 during my formative years, along with the manual for DOS 4.0. I sat down with that manual and typed in every command to see what it did, until I got to one (I think it was RECOVER?) that overwrote all 30 MB of the hard drive with garbage files. Then I had to learn how to install an OS. After that, I stopped entering random commands, but fortunately I'd already discovered BASICA and EDLIN.

2. Being able to hold a program in your brain and write it down in a notebook is a huge advantage. I've done all of my best work with pen and paper.


Re #2, I'm not a programmer, but as a writer I find the same thing to be true. I work better with a pen and legal pad than a keyboard.

I've got this feeling that we have neural pathways built for typing and, subconsciously, we favor phrases, words, and sentences that are easy to type. I know I even have a tic where if I pause to think I'll insert extraneous commas.

It's kind of like how musicians tend to improvise around licks and scales that they have practiced a lot. There's something to thinking about the idea of a song on paper that is more creative than unintentionally falling back on chords/forms that are easy to play. (Adam Neely discusses this very well here: https://www.youtube.com/watch?v=6c_LeIXrzAk)


Wouldn't the same be true of a pen and paper, though? We would be subconsciously favoring words that are easy to write?

I have horrible handwriting and am a VERY slow writer (and my hand gets cramped after writing a few words). I know whenever I have to write something by hand, I tend to choose small words and short sentences. On the other hand, I am a very fast typist and can type faster than I can think in most cases, so I tend to be much more verbose.

Yes, the medium we use matters.


It is a valid point, for sure. My handwriting is nice and easy for me (a point of pride, actually), so I can definitely see being biased toward it.

I can definitely type faster than I think, though. So my hands always end up in this kind of wait() loop while my brain processes. Sometimes, I look back at my previous sentences and realize that my hands kept going for a while before my brain even bothered to stop them. I guess I don't have that problem as much with pens.


I think we're inclined to favor whatever abstraction we grew up with, and nerds are slightly more likely to favor the next one.


I think part of the advantage of paper is that it rewards you for making up useful abstractions and inventing notation on the fly, so that you don't have to write things out. On paper, you don't get red squiggles if you're using something that's not properly defined elsewhere, and there's no penalty for using a symbol that's compact but hard to type.


RECOVER - worst-named command ever. It (AFAIR) was designed to recover data from a failing drive. The way it did this was to turn every disk sector into an individual file or similar, thus rendering the drive unbootable and, in practice, having the opposite effect from its name.

Shortly after I took out the 20 MB HDD on my dad's work PC with RECOVER, he bought me my own little home computer to wreck instead :)


> Then I had to learn how to install an OS.

I basically followed the same path you did. Once you've had to wipe and reinstall, you've gotten to a comfortable level with your systems and tend not to fear making mistakes.


I believe the most important trait for debugging is stubbornness.

When Stack Overflow and Google do not help, you start doing it yourself. You use tools like a debugger. If the tools do not help, you can look for other tools. If you cannot find them, you can develop your own. You can look at your code, the code of your operating system, and every library you use. You can trace and observe every instruction a CPU performs.


Yeah, but it's also really bad for sleeping patterns. Countless times I've started to address a bug in the evening only to find it spirals down into some obscure feature I didn't even know existed. After hours of thinking and learning more about the language/tools/computer than you thought there was, you feel great, but it's 3am.

In my experience there are only two ways to really learn about programming: teaching and debugging.


Annnnnd... that's how I ended up solving a really stupid bug last night at 2am.

I'll add curiosity to stubbornness too. I very much have to understand how a system is actually working before I'm willing to call it quits and call something successfully debugged. I've definitely worked with folks in the past who don't have that drive; they're more concerned about making it work than understanding why it does/doesn't work. My revelation last night was "deep in this (internal) library, a URL is being manipulated using a badly conceived regex, and that's why this URL is getting mangled".
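
To make that concrete, here's a hypothetical sketch in Python (my own illustration, not the actual internal library) of how an overly greedy regex can eat neighbouring query parameters while trying to strip just one:

    import re

    # Hypothetical illustration only; not the library mentioned above.
    # Goal: strip a "session" parameter from a URL's query string.
    BAD = re.compile(r"\?.*session=[^&]*&?")      # greedy ".*" swallows earlier params
    GOOD = re.compile(r"([?&])session=[^&]*&?")   # only matches the one parameter

    url = "https://example.com/search?q=hello&session=abc123&page=2"

    mangled = BAD.sub("?", url)                   # "https://example.com/search?page=2"
    intact = GOOD.sub(r"\1", url).rstrip("?&")    # "https://example.com/search?q=hello&page=2"

    print(mangled)   # q=hello silently disappeared: the "mangled URL" symptom
    print(intact)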

The curiosity to understand it on a deep level pretty much defined my academic career too. I'd already been programming (in some capacity) for about a decade by the time I turned 18 and finished high school. Computer Science was a pretty obvious career path. But I ended up doing an EE/CS dual major, entirely because I wanted to understand the pieces below software as well. A few years after undergrad, I realized that a lot of people struggled to debug complex "cloud" systems, and went back to do an M.Sc. to study the problem more carefully.

(And, per cup-of-tea's comment, I spent a fair bit of time during my M.Sc. teaching as well, including taking on a Spring & Summer session 200-level CS course as a sessional lecturer)


> but it's 3am.

Where do I sign? When I get really stuck on a bug it's sometimes taken weeks to solve in a never ending game of software dominoes. In very rare cases it ended up being a bug that you could only solve with a soldering iron and some creative rewiring of interrupts.

Bugs can be crazily obscure. To the point where you wish you hadn't started but the sunk cost of time invested is such that there is no way back either. This is one of the reasons I was so happy to switch to Unix (and later Linux): the fact that in most cases the program crashes with a core file and access to post-mortem debugging (and in many cases a nice stacktrace of how you got there) made short work of most of the bugs. Of course that means you now only have the hardest categories to deal with.


> To the point where you wish you hadn't started but the sunk cost of time invested is such that there is no way back either.

"All causes shall give way. I am in code

Stepped in so far that, should I wade no more,

Returning were as tedious as go o'er."


Anecdote: fix the root cause of some really hard to get to bug only to realize you forgot what the highest level problem was.


Are you me? :D


I don't do that anymore. As a father of small kids this is simply not an option.

My substitute is to externalize my brain: Write things down. Journaling. Blogging.


Been there. Usually before I pull out the debugger on some third-party piece of trash product from the 90s (now owned and neglected by MS) I've come up with a way to work around the problem in 15 lines of powershell and a scheduled task. It ain't pretty, and not knowing what's wrong will eat at me, but it is preferable to wasting my life on problems that I don't really want to solve that badly.


Not enough people reach for a debugger anymore and are actually willing to jump through code, set breakpoints, examine variables in memory. Not enough people dig down instruction by instruction and look at the registers. Debuggers and code monitors used to be your primary development tool. I've seen younger developers look at me as if I were some kind of sorcerer when I merely step through a C function or dump the contents of RAM. This critical skill seems like a dying art. Isn't debugging taught in school anymore?
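
For what it's worth, the barrier to entry is lower than people think. A minimal sketch in Python (pdb rather than a C-level debugger, and the function is made up, but it's the same habit of breaking, stepping, and inspecting state):

    import pdb

    def running_total(values):
        total = 0
        for i, v in enumerate(values):
            if v < 0:
                # Drop into the debugger right where things look suspicious:
                # "p i, v, total" to inspect, "n" to step, "c" to continue.
                pdb.set_trace()
            total += v
        return total

    print(running_total([3, 1, -4, 1, 5]))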


I actually rarely use a debugger. I used to, but the speed with which current processors zip through code makes interactive debugging a lot less useful than it used to be. In very rare cases it is useful, but in the majority of the work I still do in the software world I've found that proper testing is what addresses the vast majority of the cases that would have hit the debugger in the past, and much faster too.

Debuggers are still useful but I don't actually recall the last time when I hit a bug that required interactive debugging.


> Debuggers are still useful but I don't actually recall the last time when I hit a bug that required interactive debugging.

Microcontrollers: As a side rant, I'm perpetually grouchy with the fact that Arduinos don't have debugger support. I was using AVRs well before the Arduino was a thing (still have my STK500 kicking around...) and I feel like it's a pretty big lost opportunity for hobbyists to not be able to do a deep dive into what is actually going on for when they want to "level up" their MCU skills. Printfs over serial are great, but they often dork with the timing enough to change the character of bugs; JTAG debugging does too, but you can often insert breakpoints with surgical precision to see what's happening without breaking the timing of what you're trying to debug.


Yes, in the embedded world I can see lots of applications for the debugger, that's a different world. Real time emulation made life a lot easier as well, compared to the old fashioned way (burn eprom. pray. power up.)


I've been reading the 'Hackers' book by Steven Levy, and the way it describes what makes hackers what they are is exactly this kind of tinkering and problem solving.

One thing that's been fascinating me this past decade or so is that clearly programming and lots of other 'nerdy' stuff is quite mainstream now, and as a result the 'old-fashioned' concept of a hacker (or the 'hacker ethos' or whatever) seems to have become less of a specific subculture.

But at the same time, I've been trying to do some 'code school' style teaching and I find that there's still a very distinct difference between the coders who mostly just want a job and see coding as a viable path (and even enjoy it, much of the time), and coders who have that same kind of 'hacker' mentality.

It's hard to describe, because it's not just about skill or 'good code' or 'productive results'. I guess it's some degree of unusual passion or excitement about the whole thing. And definitely a tendency to just 'hack it' and come up with sometimes horrifyingly creative solutions to the problem.

For example, one guy I was teaching was incredibly messy in his code style, undisciplined, and clearly copy/pasting stuff from Stack Overflow on occasion. But when we talked about computers, programming, what kind of cool stuff he'd like to build, he was clearly different from the others. Where the others would sometimes diligently go through the Rails Tutorial chapter by chapter, with him sometimes the only thing that got him to actually learn was to give him something that 1) tickled his particular passions (games), and 2) was problematic enough that he'd be cursing at his computer until well after the 'official' study time.

And while his solution rather predictably was often messy, different from what he'd been taught, and somewhat unconventional, he usually figured it out and there was almost always some 'cleverness' to it.


Ungoogleable problems still exist. Yesterday, for example, 30 Fanuc i industrial controllers started sending back TCP RST packets and refused to initiate new TCP connections until restarted. This behaviour occurred in a system which had been stable for 2 years. Very likely an issue on one controller spread to the others by causing an error in a set of DLLs from Japan. This occurred due to another set of the same DLLs connecting to 1 controller at the same time. Maybe. I have to talk to the manufacturer to get any further. Or disassemble their libraries. I have these sorts of ungoogleable problems all the time. I doubt industrial automation is the only field like this.


Industrial automation is my field also, and yeah - you get that interesting mix of low-volume hardware, software written by electrical engineers, and messy real-world situations. You think you know what's going to happen but you're never quite sure... And when something does go wrong, it'll be a doozy.


> software written by electrical engineers

I mentioned elsewhere in this thread that I did an EE/CS dual degree after having programmed for a long time beforehand. I really don't understand why, but some of the EE-only software I've worked with has been absolutely atrocious.

One of my favourite moments in undergrad was in my Digital Communications class. I don't remember exactly what we were doing, but it involved Matlab and some kind of signal decoding. I handed in my assignment and got called in to have a one-on-one with the professor; he was certain that I had cheated and just hard-coded the answers to my assignment instead of actually writing the decoder. Why? Because my code ran instantaneously instead of sitting and grinding for 30+ seconds. What was my secret? I'd seen a couple of triply-nested for loops in the code and decided they could all be accomplished with a matrix multiply instead of doing all of that computation by hand.
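
For anyone who hasn't seen this trick, a rough sketch of the idea in Python/NumPy (my own toy example, not the original Matlab assignment): the hand-written triple loop and the single matrix multiply compute the same thing, but the latter runs inside an optimized BLAS routine.

    import numpy as np

    # Toy example: hand-rolled triple loop vs. one matrix multiply.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 300))
    B = rng.standard_normal((300, 150))

    def matmul_loops(A, B):
        n, k = A.shape
        _, m = B.shape
        C = np.zeros((n, m))
        for i in range(n):              # three nested loops, run in the interpreter
            for j in range(m):
                for p in range(k):
                    C[i, j] += A[i, p] * B[p, j]
        return C

    C_slow = matmul_loops(A, B)
    C_fast = A @ B                      # same computation, pushed down into BLAS

    print(np.allclose(C_slow, C_fast))  # True: identical result, vastly faster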

In the same class, a friend (also an EE/CS dual) and I came up with the dynamic programming solution to the Viterbi decoding problem as well, and again, got taken to task for having a solution that ran too fast. Sigh...


> Or disassemble their libraries.

Be warned that FANUC is very litigious about that sort of thing. I can't go into specifics, but they've sued companies out of existence for less.


Being able to solve problems like that one results in you looking like a wizard, btw.

In case you haven't seen them, the SysInternals tools can serve as an interesting middle ground before popping the DLLs into a disassembler. There's one (whose name escapes me) that is basically strace-for-Windows, and I've used it to successfully solve some pretty gnarly Win32 problems before.


Are you thinking of Process Monitor?


Just looked it up, and yes. Process Monitor and Process Explorer, depending on what kind of problem it is.


With a bit of web searching, that's what I found too.


That sounds like an instance of the Thundering Herd problem.


Thanks for making me look into that :). I'm half expecting a week or so of HN posts diving into this topic...


It can be pretty tricky to defend against this since it is mostly a client side issue. The best trick that I've found is to randomize the initial delay + exponential back-off for subsequent retries.
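
A generic sketch of that idea in Python (nothing FANUC-specific; the function and parameter values are illustrative only): a random initial delay plus "full jitter" exponential back-off, so a fleet of clients doesn't retry in lockstep.

    import random
    import time

    # Generic client-side reconnect sketch; parameters are arbitrary.
    def reconnect_with_backoff(connect, max_attempts=8, base=1.0, cap=60.0):
        # A random initial delay spreads clients out before they even start.
        time.sleep(random.uniform(0, base))
        for attempt in range(max_attempts):
            try:
                return connect()
            except ConnectionError:
                # Full jitter: sleep a random amount up to the exponential ceiling,
                # so many clients retrying at once don't synchronize into a herd.
                time.sleep(random.uniform(0, min(cap, base * 2 ** attempt)))
        raise RuntimeError("gave up after {} attempts".format(max_attempts))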


Every generation wonders that about the next one.

This is what the previous generation pondered:

I sometimes wonder if growing up before pre-built computers and lots of books about computers was actually a boon for my problem solving skills. I spent so many hours with nothing to go on but some schematics and my own persistent trial-and-error. If I'd done the same stuff these days I don't think I'd have learned half as much (although would I have done more, different things? Probably...)


Thing is, as someone probably a generation apart, at least, I don't disagree.

We all work within a context of abstractions, and in any given generation, poking and prodding at those abstractions and trying to understand what happens below it is immensely valuable and instructive. For one generation, below it might mean pen and pencil, and for the other it might mean looking into what jQuery actually does.

Let's not forget that 'pen and pencil' was once upon a time an abstraction inaccessible to many too!


Socrates complained that writing was ruining thinking and memory, IIRC.


A determination to debug beyond just googling is ABSOLUTELY IMPORTANT for a software engineer. You are right in that it was probably better when an engineer would have to learn these skills the hard way; but let me offer myself as a counterpoint: I loved that Google/SO allowed me to get started and debug common problems right off the bat; it reduced the barrier to learning programming. Once I gained sufficient proficiency, I felt more confident to delve deeper into code and lower level specifics to debug problems.

I guess what I'm saying is that the determined problem-solving mentality that was required before Google/SO is still important, but it's possible to learn it even with Google/SO.


> A determination to debug beyond just googling is ABSOLUTELY IMPORTANT for a software engineer.

We're going to need you to time-box that fix. If you don't have a solution within a day, assign the Jira to Bob so he can take a look at it.


But... Bob doesn't know his ass from a hole in the ground! It'll never get fixed!

In my current gig, I guess I'm Bob. Whenever someone can't figure out the solution to a problem, it just gets punted over to me to figure out wtf someone did a couple of years ago that makes no sense but coincidentally worked for exactly the problem they had. /rant from last night.


That's the untested (by me) contrapositive to my story. Now you don't have to spend untold hours trying to figure out why that goddamn COM call is crashing, but also, you don't have to waste hours on trivial technical crap when you could have spent those untold hours learning more about how to build complex systems.

It ties into one of the best things about programming / software engineering / software architecture, which is that it's a self-scaling system. Everyone eventually finds the level at which it becomes 'too hard', so it's always a challenge.


Is this the Peter Principle[0] of software engineering?

[0] https://en.wikipedia.org/wiki/Peter_principle


JFYI, C.S.Lewis:

“Experience: that most brutal of teachers. But you learn, my God do you learn.”


I am so jealous. :)

I started out on a similar computer, but once I worked my way through the basic manual that came with it, that was it. I couldn't get any more documentation until I got my hands on an XT many years later.


Yeah, I really lucked out. The Vic was super old when I first got it, and we got an XT not too long later. Vic-Basic (3k of RAM) to XT (640k of RAM) was a huge jump. Then the owner of a local computer store I hung out at gave me a copy of Power-C for $20 I think. Not too long after, we got a Pentium 133 and I had a great box to dual-boot Linux on.

It was a town of 35k people, and I just happened to luck out and meet the right ones at the right time. The owner of the computer store that gave me the C compiler ended up hiring me for summers and weekends through high school, and then had me back for the first couple years of summer during university too.


This is why I hate the quote "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." and "The teaching of BASIC should be rated as a criminal offence: it mutilates the mind beyond recovery."

I started coding by reading the huge instruction manual that came with the Amstrad CPC and didn't have an issue moving on to other languages later. As long as you're willing to keep up with what's new and are passionate about improving your skills it doesn't matter what you started with.


That sounds like Dijkstra, from EWD498 (1975):

http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EW...

At least, the "mentally mutilated" phrase is about BASIC. The "criminal offence" phrase was actually about COBOL. You'll note that Dijkstra also had choice words about FORTRAN, APL, and PL/I, which were popular programming languages of the day.

It's tempting to say that Dijkstra was being tongue-in-cheek when he wrote this. I never met him, but based on the attitude I picked up from his other writings, I'd guess that if you had asked him he'd have said he was completely serious.

And he has a point. I first learned programming in BASIC, and look how I turned out. (Well maybe I confirm Dijkstra's point. Others will have to judge.) But I do recall a bunch of characteristics of early BASIC systems that potentially led to terrible programming habits: GOTO and line numbers leading to spaghetti code; all global variables; no scope; no functions; self-modifying code (overlays); etc. All of these had to be unlearned at some point.


> But I do recall a bunch of characteristics of early BASIC systems that potentially led to terrible programming habits: GOTO and line numbers leading to spaghetti code; all global variables; no scope; no functions; self-modifying code (overlays); etc. All of these had to be unlearned at some point.

Compared to just being told these things are bad and should be avoided, I'd like to think that being forced to code in the above ways makes you better rounded as you have first hand experience for why those practices are a bad idea. It's not like you have much choice if you have to code in something like assembly as well.

I've worked with programmers before who can regurgitate the names of current bad practices but can't describe the reasoning behind them which isn't a good thing.

Personally, I think it's an absurd idea that e.g. if you began coding with all global variables and no functions you'd be unable to code in a different way. If you're amazing at coding in an environment that makes it challenging to write complex programs, you're going to quickly become efficient at writing code in environments that make it easier to write complex programs.


Sure, I don't disagree. Personally, after moving away from BASIC to other languages, I felt freed from a bunch of problems I had had with BASIC. For example, scope eliminated problems with reuse of variable names. Perhaps lots of other people who started with BASIC felt the same way.

On the other hand, sometimes other people never transcended those bad habits from BASIC. Sometimes you run across programs that are one big function, with variables reused, etc. I don't know what the thought processes are of programmers who write such code. There's an old saw, "You can write FORTRAN in any language." Maybe this should be applied to BASIC as well. That's probably what Dijkstra was driving at. I don't know about "mental mutilation" (certainly hyperbole, in my book) but only Dijkstra knows what he really meant.


> On the other hand, sometimes other people never transcended those bad habits from BASIC

The kind of people that don't actively keep up with current best practices and latest developments are always going to be bad programmers though. It's not like you can pick a perfect language to start coding in that will teach you all the current and future best ways to do things. I personally don't think it matters what language you start coding in as long as you're always learning and adapting.


That is how I started with Java. I mean I started 5 years ago but I quickly learned other ways by simply running into intractable problems when you try to work that way. Modern programs end up so complicated that having hundreds of lines of crap in a single function is impossible to reason about and causes you to write bugs into everything.

I think working with others and pulling apart code and shrinking it down until it does one thing and one thing alone was the most important learning experience. Dijkstra might be one of the smartest computer scientists ever, but I don't think that his assessment of learning that way was correct.


There were many different flavors of basic and some were actually quite good. GFA Basic on the Atari ST and BBC Basic on the BBC Micro for instance.


I wonder: technically Basic (MSX Basic to be precise) was the language I wrote my first working program in, but I've learnt to program on paper. I am serious: we had no computers in school, but we had a programming course and a teacher who took that seriously. So I've learnt about variables, loops, control flow etc. by writing pseudocode on paper. When I got access to the computer all I needed was a syntax. So is my brain mutilated or not? :|


I think that maybe webdev, maligned as it is, fills that role. You can make something useful quickly. "Boot times" (page reload) are short. State can be saved but is ephemeral by default.


I agree, webdev is the modern equivalent of a fresh new playground.

I started with computers in 3rd grade, with Logo. It was enthralling. I was enamoured with computers right away. I learned you could stop some Apple games and view the code. I learned you could even print it (I had reams of printed code; I didn't know what 99% of it did, what is a gosub? lol). Most of this learning was by accident, as I didn't have a single book to teach me anything, and only 1 other real nerd friend who showed me how to program lowres graphics. (One friend got bootleg copies of games from his older brother though, lol. Anyone remember punching holes on the side of a floppy disk to make it store more data?)

I tried over and over to create games, and made dozens of odd text adventures and stupid Space Invaders games that crashed after a couple minutes of play. In high school I found out (by watching the older kids) about hacking system resources on Macs, and we'd mess with the menus, lol. Pascal was interesting, but I never learned to make anything cool with it. Finally, when I got into college, I got some real coding experience, and found out you can mix art (I was a flip-book animating nut as a kid, made hundreds of them) with programming.

Eventually the web took over, and I found out you didn't have to "install" anything on someone else's machine ever again, I was hooked.

As bad as the web is in so many ways, I never want to go back to supporting other peoples computers ever, ever, ever again.

I will take JS hell over that garbage. (Imagine helping a client debug an install issue on a computer over the phone, and he's 2,000 miles away on a Caribbean island living it up and blaming you for him clicking the wrong button because he simply refuses to follow your instructions "explicitly" and can't ever be wrong... ugh)


Imagine helping a client debug an install issue on a computer over the phone (...)

Instead, that client is using IE6, doesn't know how to tell you that, and IT restrictions prevent them from upgrading or using another browser. Or they have Chrome and a shitty addon that breaks your site in a way that you can't even imagine. Or they are behind a workplace firewall that blocks some specific ___domain used by the site.

I've done support for a webapp, and I've had to use remote desktop sharing quite a few times.


>Instead, that client is using IE6, doesn't know how to tell you that, and IT restrictions prevent them from upgrading or using another browser.

At least _I_ personally can do something about it, instead of grasping at straws and begging someone else to do something right on their end. If they want to use IE, fine, but they only get IE-supported features and everything costs 10x as much, win-win.

>Or they have Chrome and a shitty addon that breaks your site in a way that you can't even imagine.

Dealt with this repeatedly: "Sir, try Firefox, it works there? Great, use that instead, or remove the Chrome plugins."

I had a client that was in the Beta channel of chrome and had some obscure bug causing her issues, but no one else. I figured that out just fine.

Just last week a client had the stupid Grammarly plugin wreck something online, fixed in minutes.

>Or they are behind a workplace firewall that blocks some specific ___domain used by the site.

Say, "Call your tech support guys, I am unable to fix that." I can't say that to a guy that is holding only a CD that got mailed to him, because I am the only support then.

>I've done support for a webapp, and I've had to use remote desktop sharing quite a few times.

Yep, and in the near future this will work in the browser, we are close now. And the beauty of the internet is that if they don't have access, we don't have to debug until they do. (their ISPs/intranet problem) But in the old days, it didn't matter what the issue was, and if the computer was not online, then the only data I could get to debug from was irate staff.


You can't just turn on a machine and "webdev".

You need a TCP/IP stack, HTTP server and client.

You'll need an editor & graphics tools. Sketch and Zeplin have a pretty steep learning curve. You'll probably want Webpack or Gulp.

That is, unless you're writing HTML 1.x, which gives you such exciting tags as <BLINK> and <P> and <HR>.

On the other hand, I can fire up a Commodore 64 or Atari 800 with built-in BASIC interpreter and do actual math. I can process input and interface with hardware at the lowest level. I can generate graphics (in multiple graphics modes). I could, in theory, write that same HTML 1.x code (in BASIC) with a set of routines to generate and save the very same.

Financial investment is lower, and I'd say the return on that investment comes 10X quicker than with a Mac running Sketch, Zeplin and YouTube tutorials.


I started with a Vic-20. I wrote a BBS for it with multiple rooms (message areas), private mail and an online game. Users could start their own rooms and make them public or private. It was very popular with each user spending an average of 70 minutes on it.

That's how I got into programming professionally. One of my users said "Anyone who can write a BBS for a Vic can program!" and hired me as a programmer. Thirty-two years later, that same guy now wants me to work with him at Google.


This one speaks the truth.


Hey Rachel, the BBC micro:bit definitely fits the parameters you listed at the end of this post. Only $12, so definitely worth picking one up to fool with.

https://en.wikipedia.org/wiki/Micro_Bit


You can't do anything with that without first connecting it to a bigger computer.


JavaScript and reloading a web page are the obvious parallel. There are one-liners for changing colors, fonts, etc. A tiny bit of canvas setup and you can draw freely. I think sound is slightly more complicated, but not much. It might make sense to make a little JS library (maybe call it basic.js, though I'm not actually recommending copying the BASIC language) and then write some blog posts that start with loading it, and then fooling around in the JS console.


In a way, this is a big part of how I transitioned from someone who knew how to program a Commodore 64 to someone that could help people build web pages (and eventually be able to do other things good, too.)

I had a Geocities web site, and I loaded it down with JavaScript widgets that I wrote. Things to flip images around. Things to annoy you with alerts. Then I spent countless hours in MS Paint making images out of bitmaps which I then wired up using JavaScript to emulate an Iron-man watch (complete with glowing and a very bad battery!)

There was "no risk." It was fun and exploratory. It didn't achieve anything useful for anyone else, but I learned and got confident in learning (and experimenting.) When someone actually hired me and threw ColdFusion at me, it was wildly different... from QBasic and from JavaScript... but I just experimented and figured out how to do CRUDDY things.


Nice, my dad was quite prescient as well and bought me and my sister Vic-20s with little black and white matching TVs. I have great memories of hacking the random sentence generator in BASIC and loading/saving programs to cassette tape over many minutes.

A teacher so far ahead of his time at school had a TRS-80 that we would have lessons on. It would be at least 5 years later that I would touch another computer, a PC XT running DOS.


Installing MkLinux in the late 90s on an old 6100 meant transferring disks copied over the internet that took a few days each, while everyone was sleeping, and sometimes the disks failed, then getting the damn thing installed... but then it worked! It gave me a reward for my perseverance, and I've been getting that reward ever since as I continue to stubbornly chase bugs and fix genuinely interesting problems.

I learned by trial and error, before you could just google a problem and have it pop up in StackOverflow. It taught me one major lesson: debugging is how you learn about everything in the environment. A bug is an opportunity to explore the programming environment, framework etc and you'll get a grasp of the platform through searching for solutions. StackOverflow is nice for quick fixes but it doesn't give you the deep knowledge that debugging does. I recommend to my team, if they find a bug spend half a day figuring stuff out from the ground up (where we have that time luxury) and it makes everyone more knowledgable in the long term.


Reminds me of something I wrote a while back: http://blog.jgc.org/2009/08/just-give-me-simple-cpu-and-few-...

Those of us who started programming in that era had simple, relatively cheap, fully comprehensible computers in front of us.


I agree with most of the sentiments, except for one key statement:

> "Just about any computer you can buy today is going to have some kind of non-trivial boot time. It probably will require a whole ecosystem of crazy things to make it work. It will be fragile. You can break it easily."

I vehemently disagree with the italicized portion, at least in the way it's presented.

Those on here probably aren't affected, but many of the non-technical users I know are afraid of learning how to go beyond simple tasks on their computer (word processing, internet browsing, etc.) because they're afraid they'll break their expensive box and have to pay someone to fix it.

Most UIs today seem to help users avoid breaking things and typically have some sort of notification if they'll change something that would break their machine (e.g. UAC in Windows, padlock in OS X).

Even beyond this, general application development wouldn't pose these risks. Hardware hacking certainly could, but any time you're dealing with the physical workings of something that's an assumed risk.


I think "break it easily" does not refer to actually breaking the hardware but ending up in a situation where you cannot come back to where you were before. My mother frequently called for my help when she made an application fullscreen and could not go away from it, and she considered the computer broken at that point.

There are also numerous ways you can make computers unbootable by removing files, installing the wrong thing that runs on boot, and so on. I maybe wouldn't say it's fragile and can break easily, but it can, and if you just go around clicking on things, it's probably gonna break after a while.


That's a great take, didn't think of that. Thanks for your response.


I've actually been pondering this problem for a couple of years: when everything is based on flash, what do you do when something corrupts your "base install"?

The VIC-20, C64, Timex-Sinclair and other computers of their day had the OS in ROM that couldn't be changed. So you couldn't break these computers via the keyboard (or other peripherals) but if there was a bug somewhere in the firmware (which included the OS and BASIC) you had to live with it for the life of the device.

You can get back to those days via emulation - https://vic-20.appspot.com/emus.htm, but there's nothing (that I know of) that's a physical device with these characteristics.


You can protect pages on most flash devices (e.g. internal microcontroller storage). Lock the bootloader and factory reset image, so you can still update the running firmware, but also have a way to recover from any scenario with a hardware button or sufficiently clever watchdog.


Why aren't cloud, or even local, VMs being brought up? You irrevocably blow up your Linux VM... just spin up a new one, or revert to your snapshot from earlier in the day. I'd actually make the argument that virtualization makes it easier than ever to safely experiment with things. Complexity has definitely gone up, and the time and ability to peacefully munch on problems under a tree has definitely gone down. But I would argue that if a learner is given the right sort of virtual sandbox, instructions on how to reset their state to working order, and an internet connection with Google, the Linux docs, and Stack Overflow bookmarked, there is a lot of learning to be had!


It's a chicken-and-egg problem.

Bootstrapping an 8-bit microcomputer is so much simpler than the first order of operations in the hosted VM model.


> I suppose you could come up with some simple machine which plugged into your TV and let you experiment with a handful of primitives to get a feel for starting out.

A modern equivalent might be the Micro Bit (http://microbit.org/) - it doesn't plug into a TV but it does have a rudimentary display built in, it's programmable and it has various sensors and I/O on it.

Also the entire Arduino ecosystem fills a similar niche, of a bare-metal-ish platform which lets you jump right in and start coding things.


They really don't fit the same niche as the little machines that you could plug into your TV and either load up some simple arcade games or produce your own using BASIC.


That's not really a niche any more, though. It's almost cheaper to just put an LCD on something than to interface with modern TVs.

If you want something more in that spirit regardless of hardware, maybe PICO-8? It's basically a (virtual) minimalist 8-bit computer / gaming console for the modern era.

https://www.lexaloffle.com/pico-8.php


The author asks whether her principles are relevant today, and I think they are, at least for someone like me. Firstly, starting by poking around a small issue that interests me generates motivation: as soon as I have made some progress, I find lots of things that I want to look into more deeply. Secondly, it gives me confidence that I can master the topic, an antidote to the sinking feeling I get from looking at a multi-page table of contents.


Does anyone make a C64-ish SoC with SID-style sound limitations, HDMI out, 1/8" audio out, 8-16 GPIO pins, good documentation, and a development environment (C/ASM)? I wish a simple (restrictive) mid-range system like this existed to develop on as a hobbyist. The Pi is too much, and Arduino doesn't have the "fun" (interesting sound chip, built-in outputs, marketing/documentation)... Does anything like this exist?


I can highly recommend MIST: https://lotharek.pl/productdetail.php?id=97

Works like a charm and can run the C64, VIC, Atari ST, and various older computers like the PET, TI-99/4A, ZX81, or Spectrum.



Interesting!


I think it's because everyone who wants something like that just buys a C64 (or NES, or whatever)


Yes, that makes sense. I wish there was a market for these kinds of systems at the Arduino form factor / cost point with more of a hobbyist bent.


Arduino is the spiritual successor to the C64-era experience she's looking for.

It worked fantastically for my kids and brought back waves of nostalgia watching them grok the basics like I did.


Honest question: can anyone explain who this writer is, and why her posts keep hitting the front page so frequently these days? There's no bio on the site so I'm guessing there's some kind of Valley insider knowledge going on as to why her opinions are considered significant. Sorry if that's a dumb question, I just don't see the appeal of these musings and I'm wondering if I'm missing out on some crucial context, other than the fact that she's old enough to remember the 8-bit days.


She was an engineer at Google, then a freelancer for a bit, then an engineer at Facebook. Her debugging skills are otherworldly, and she is a decent writer, which helps her make Hacker News a lot.

She also has dealt with a lot of the social issues inside Silicon Valley, and has some pretty good insights there as well.

Perhaps 1/3 of her stories are cool "How the hell did she figure that out?" debugging stories. 1/3 are social commentary, and the remaining third are about odd bits of technology, like this post.


So good content gets upvoted.

I read them and upvote the ones I find interesting.


I like her posts and read/upvote them whenever I see them. It always seemed natural that they'd be on the front page of HN. It's the sort of content I'm here for.


Same for me. I like her writing style and the content.


I think part of this is the 'frontpage echo effect'. Quite regularly there'll be a few days/weeks of articles about either a particular topic (that is not time-sensitive), a particular theme, or from a particular source.

Sometimes it's submissions of interesting links from the discussion of a previous article (I've done that more than once), sometimes it's from people who decide to read past entries from the same source (or add said source to their reading habits). And I'm sure there are other reasons why this happens.


That’s part of the charm of HN: scrolling through the top articles you’ll see brand new experimental software tool, a NYTimes piece about some world changing technology, then some rando engineer’s personal blog—right next to each other. Much more variety than the “top” search results elsewhere.



Came here to ask the same thing. Thanks to the others in the thread for providing some context. Personally I haven't found any of the posts from this site to be interesting/entertaining... but please link the ones people think are.


It's okay for you not to be interested in the kinds of things she writes about. Different hackers are interested in different things. But it turns out that a lot of hackers are interested in the things Rachel writes about, and we enjoy her plainspoken way of writing about them — confident without ever approaching being arrogant or condescending to the reader. (Sometimes they are pretty harsh on her previous coworkers.)


The HN search is the easiest way to find the most popular submissions of a certain site:

https://hn.algolia.com/?query=rachelbythebay.com&sort=byPopu...


Or you can click on the ___domain url from the frontpage (easier, though it doesn't sort by popularity so is less helpful in that regard for larger sites) - https://news.ycombinator.com/from?site=rachelbythebay.com


She is a very insightful and technical writer. She's also seen a lot of current big business, and so she is the insider's outsider really.

I thoroughly enjoy her writing and every time it is posted to HN I read it.


I was ~literally~ about to create a post (when there were no comments) describing exactly this. But when I refreshed I saw your comment. I am wondering the same.




