I was a computer geek from about 1981 onwards, starting with Commodore PETs at high school. These machines were uniquely accessible. You could do surprisingly creative graphics using only the graphic characters hardwired into the machine (no bitmaps, colours or sprites). But these characters were printed right on the keyboard, and could be used in normal BASIC "PRINT" statements to do quite performant animations and such, without any advanced programming skill.
That was as good as it got. The difference between a beginner dabbling with his first PRINT loop and an advanced assembly language programmer wasn't that great, because the machine was so limited; you could go from one extreme to the other in skill in a year or two if you were interested enough. My natural response to seeing the "MISER" adventure, after solving it of course, was "I have all the programming skills to make a game like that," and I did.
And then, as now, only <1% of the general population was interested enough to get good at programming these things, but another 10-20% was interested enough to hang around, dabble, and try the latest cool programs that came down the pipe or were made by that 1%. I had people playing a game I made that consisted of one (long) line of BASIC.
Then, it seemed, most of the rest of the population (the other 80-90%) got computers too, mostly Commodore 64s where I lived. And still, even if only a tiny minority actually programmed their own stuff, it felt like being part of a vibrant scene; you could always show off your stuff.
With modern machines, even without the discouragement of locked-down operating systems, the lack of accessible programming languages and such... there is just no hope of making something "cool" you can impress your friends with. So the tiny minority that does learn HTML5, JavaScript, Arduino programming, what have you, is relatively obscure and seems to be a shadow of its former self. It isn't, really. It's just that 99.99% of the computing power now in the hands of the population will never be tinkered with.
> there is just no hope of making something "cool" you can impress your friends with
I believe there is hope of making cool graphics effects with GPU shaders. GPU tech is still advancing much as CPUs were back then, so people have not yet explored all the possibilities of the hardware. You can do impressive things just by experimenting, even if you don't have a lot of theoretical knowledge. Sites like shadertoy.com make it easy to get started.
> With modern machines, even without the discouragement of locked-down operating systems, the lack of accessible programming languages and such... there is just no hope of making something "cool" you can impress your friends with.
Every modern computing machine I know allows you to add a very rich set of coding tools, other than unjailbroken mobile devices perhaps. Yeah, poor access to mobile sucks, but if you feel the need to hack, you won't let walled gardens stop you.
Web scraping is an insanely rich source of data for building cool tools. Then you might add a bit of simple machine learning, like a voice-based interface or a GAN-based video face-melter, or visualize the scraped results on a map. These sorts of tricks were hard or impossible 10 or 20 years ago. But not today.
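To make the scrape-then-map idea concrete, here is a minimal Python sketch. The URL, the "li.event" selector, and the output filename are made-up placeholders, not any real site's structure, and it assumes the requests, beautifulsoup4, geopy and folium packages are installed:

    # Hypothetical scrape-and-map sketch; the URL and "li.event" selector
    # are placeholders, not a real site's structure.
    import requests
    from bs4 import BeautifulSoup
    from geopy.geocoders import Nominatim
    import folium

    # 1. Scrape: fetch a page and pull place names out of list items.
    html = requests.get("https://example.com/events", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    places = [li.get_text(strip=True) for li in soup.select("li.event")]

    # 2. Geocode: turn each place name into latitude/longitude.
    geocoder = Nominatim(user_agent="hobby-scraper-demo")

    # 3. Map: drop a marker per place and write a self-contained HTML map.
    world = folium.Map(location=[0, 0], zoom_start=2)
    for name in places:
        hit = geocoder.geocode(name)
        if hit:  # skip anything the geocoder can't resolve
            folium.Marker([hit.latitude, hit.longitude], popup=name).add_to(world)
    world.save("events_map.html")  # open in any browser

Swap in a real page and a real selector and it is still only a screenful of code, which is roughly the point: the barrier to a "cool" demo is lower than it looks.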
If you want to hack, start by immersing yourself in Linux. Or a Raspberry Pi. Better yet, both.
You are looking at this through the eyes of someone who is already comfortable with their understanding of computers and software. To a kid with no prior knowledge, the entry barriers might appear impossibly high.
No, they're lower than ever. A kid is one Google search away from a huge number of usable resources: YouTube videos, web IDEs they can use on a locked-down iPad or school Chromebook, and more.
And doing anything non-trivial requires years of tediously building up knowledge of all the abstraction layers and dependencies between you and the machine. For a long time there is nothing you can do that a million others haven't already done far better and made readily accessible.
I can personally attest to what a massive turn-off this is. I grew up in an age when computers were a part of everyday life, but their inner workings were hidden behind a mountain of ugly obfuscation. If you find the rare trivial task, like stamping out websites, fascinating, good for you. But for people like me who couldn't care less, it takes some sort of external impetus to actually discover an interest in computing. In my case it was a half-assed mandatory programming class in engineering school, where I found out I had a talent for it and discovered an interest in the inner workings of things I had been taking for granted all my life.
Just because resources are easy to find doesn't mean anybody cares to find them.