
I somehow memorized most of the function key combos for WordPerfect when in high school. And it wasn't like I was doing loads of reports for school. I guess it was a combination of there not being much else to do with the PC I had than playing the few (copied) games I owned and fiddling with my copy of WordPerfect 5.1 (supplied by my neighbor, who taught using it). In any case, bored as I was, I explored a lot of the feature set and wrapped my head around features I would never use. Like creating mailings. I don't think I've ever actually had to create one. But it was there, so I dove into it. For the post-internet generation: this is how you would create a snail-mail spam campaign in the early nineties. Merge a list of addressees with a letter containing the right merge codes, then print personalized letters.

I also had a hand-me-down Commodore 64 before that. My uncle donated it when he got his first PC. I taught myself BASIC on that, and with a few PEEKs and POKEs even managed a simple game. Alas, I had no disk drive and never thought to actually save my creations anywhere. Like on the tape drive I did have. The Commodore 64 was great though. And my uncle bundled some introductory computer science stuff with it (a primer on bits and bytes) that, along with the excellent C64 manual, went a long way toward getting me into programming. My local library was useless. I had no access to information. There was no internet (at least not accessible to me; I had not even heard of it). But that C64 manual got me curious and I had nothing better to do. I did not realize it at the time, but that bit of Commodore 64 documentation and computer science intro is what changed my life.

The PC I got after that was relatively boring because it did not include anything useful in terms of documentation. Starved of information, I dove into WordPerfect.


Story time. During the last total eclipse, I was commuting from Boston to San Francisco a bunch and planned my flight to coincide with the path of totality. I brought enough eclipse glasses for the entire flight. The flight attendants were kind enough to distribute them and even gave them to the pilot and copilot. The flight crew was excited about it and actually got approval for a change to their flight plan so that they could bank the plane and let people on both sides look out the window to see the eclipse. This was back in the days of Virgin America, and as a thank-you they sent me a little desk statue of a Virgin America plane. I keep it on my desk in fond memory of my favorite airline. Also got some cool photos of the flight crew and passengers all wearing eclipse glasses. (https://mackman.net/va.jpg)

Here's a real world example.

On a product's multi-year death march, I looked at the next bug assigned to me for the project-within-the-product I was on.

Over an hour of diligent debugging revealed the problem: the C++ code was meant to do

    X = Y;
but someone had typed and committed:

    X == Y;
The destructive value assignment became an innocuous comparison, whose result is immediately ignored.

I decided to search the rest of the tens-of-KLOCs project for similar assignment-turned-comparison statements.

That's mindless and tedious work perfectly designed for a computer.
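Not the actual batch file from that weekend, just a rough equivalent as a perl one-liner (the regex and the file glob here are my guesses, and it will still turn up false positives):

    perl -ne 'print "$ARGV:$.: $_" if /^\s*[\w.\[\]>-]+\s*==\s*[^;=]+;\s*$/; close ARGV if eof' *.cpp *.h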

Several minutes later, after weeding through the false positives, I created bug entries for any offenders.

Did I stop there? No.

I connected to the source server for the entire product and kicked off the same search.

When the search finished, I separated the buggy wheat from the chaff and created bug entries accordingly.

This happened on the weekend.

When the product triage team met the following weekday morning, they saw all the bugs entered across the several projects in the product due to the same root cause: a double equals sign where a single one was intended.

Management decided to take the next step. They bought a site license for a static source code analyzer. We integrated the analyzer into our project's build process, ran it on each build, and triaged the results.

Highest compliment I got for creating all this "extra work": "F*ck you!" said with a smile.

Did I stop there? No.

I kept my eyes and ears on the lookout for more possible typos. I would read commit emails and resolutions to bugs to see if the cause/fix would fit the model of "easily-found-via-grep".

I expanded my batch file to cover new cases and created new bugs when the batch file found them.

Did I stop there? No.

Eventually I hit the wall of diminishing returns for source code analysis via regular-expression searching.

I looked for a tool that could go to the next level.

At that time, the one scripting language available in our project's build tools was perl.

Off to the bookstore and I bought O'Reilly's Pink Camel Perl book.

A few hours later, I had a rudimentary source code analyzer that looked at lines of source code for typos.

I added more cases as appropriate.
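Roughly speaking, that early version was just a loop over lines and a table of regexes. A minimal sketch, assuming checks along these lines (the patterns and messages are illustrative, not the originals):

    #!/usr/bin/perl
    # Line-at-a-time typo checker: run each source line through a table of
    # regexes and report anything that matches. The patterns are examples only.
    use strict;
    use warnings;

    my @checks = (
        [ qr/^\s*[\w.\[\]>-]+\s*==\s*[^;=]+;\s*$/,
          'comparison used where an assignment was likely intended' ],
        [ qr/;\s*;\s*$/,
          'stray double semicolon' ],
        [ qr/\bif\s*\([^)]*[^=!<>]=[^=][^)]*\)/,
          'assignment inside an if condition' ],
    );

    while (my $line = <>) {
        for my $check (@checks) {
            my ($pattern, $message) = @$check;
            print "$ARGV:$.: $message\n" if $line =~ $pattern;
        }
        close ARGV if eof;   # reset $. for the next file
    }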

But C/C++ source code isn't rigidly formatted, and programmers wouldn't play nice and limit themselves to one statement per source line. Preprocessor macros also confused the line-at-a-time script analysis.

So I bit the bullet and expanded the perl script to "parse" C/C++ code.

I added checks to make sure memory allocations were checked for failure (no exception-handling for memory failure in our codebase).
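A crude sketch of how such a check might work, assuming malloc/calloc/new-style allocations and a fixed three-line lookahead (both assumptions are mine, not how the real script did it):

    #!/usr/bin/perl
    # Sketch of an unchecked-allocation check: flag "p = malloc(...)" or
    # "p = new Foo" where none of the next few lines appears to test p.
    use strict;
    use warnings;

    for my $file (@ARGV) {
        open my $fh, '<', $file or do { warn "$file: $!\n"; next };
        my @lines = <$fh>;
        close $fh;
        for my $i (0 .. $#lines) {
            next unless $lines[$i] =~ /\b(\w+)\s*=\s*(?:malloc|calloc|new)\b/;
            my $ptr  = $1;
            my $last = $i + 3 > $#lines ? $#lines : $i + 3;
            my $checked = 0;
            for my $j ($i + 1 .. $last) {
                # accept "if (p", "if (!p", or "if (NULL == p" as a failure check
                $checked = 1 if $lines[$j] =~ /\bif\s*\(\s*(?:!\s*)?(?:NULL\s*==\s*)?\Q$ptr\E\b/;
            }
            printf "%s:%d: '%s' assigned from an allocator but apparently never checked\n",
                $file, $i + 1, $ptr
                unless $checked;
        }
    }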

Did I stop there? No.

I publicized the script within the company, answered questions, listened to suggestions, and offered help to other product groups in the company.

A couple of other product groups integrated the script into their build process. Obviously it'd be better if each programmer ran the script before committing modified code.

Did I stop there? No.

I had worked on the company's new signature product before the current project's death march had started. I still had access to that signature product's source code. So I would run my perl script against the product's source code. That product is huge. A source code scan over the network (my work machine didn't have enough disk space to enlist in every project in the product) would take all night. I had access to the bug database but I didn't know which source code directory mapped to which project in that product.

So I did the next, most annoying thing - I sent a cleansed list of defects to all the developers in the product.

Did I stop there? No.

The company had bought a static source code analyzer product and proceeded to integrate it into most of the company's products' development process. They created a stripped down version of the source code analyzer suitable for developer use before committing code. My perl script was obsolete now.

But that didn't mean others couldn't benefit from it even though the company didn't need it anymore.

I noticed a "call for papers" notice for an open source conference.

I emailed the company's legal department and requested to open source my perl script. Their reply: "Permission granted."

I wrote my proposal and sent it in to the conference organizers, who accepted it. Then I wrote the talk and presented it at the open source conference.

Did I stop there? No.

Talk is nice. Code is better. I published/uploaded the script on CPAN (the Comprehensive Perl Archive Network - https://www.cpan.org ).

Did I stop there? No.

Around the time Coverity started scanning open source projects, I downloaded the source code of several prominent open source projects, scanned their code, and sent emails with possible bugs where appropriate.


I honestly think we will one day look back and see this as taking a hammer to a screw. Sure it will work, but it's a horribly brute-force way of doing it. The brain almost certainly processes language in a more elegant way that, once understood, won't require such raw computational power.

The brainwaves associated with a wakeful state (and thus generating and parsing language) happen around 40 Hz. Our machines are running in the GHz region. We are missing some fundamental insight.

At the risk of ending up front and center in the hall of prediction hubris, here is my bold prediction: By 2040 it will be possible to do what these TPUs are doing today on your consumer-grade computing device, but it won't be because of a dramatic increase in the computing resources available to such devices.


> That’s 2500 to 7500.- CHF worth of scans if I get a third party to do it.

This is how I ended up with my first laser printer.

My aunt was asked to publish a limited edition of a book she had translated into German. The printer wanted a ~1000 dpi film, and we got quotes on how much it'd cost to have the photolith done. We then concluded that a good laser printer that could print on transparencies would be far cheaper and fit our budget, so we asked the printer if that would work. When they said it would, we got the printer, I tweaked the halftoning a bit, and off we went.

In the end, the German translation we did was perceptibly better quality than the professionally typeset Portuguese version. And I got a nice laser printer.


My guess is PR.

Regularly, on imgur, you see a pic promoting a celebrity, a rich person, or a movie. It looks organic, but if you look closely, there are plenty of weird things about it. Then it disappears as suddenly as it arrived.

I believe they sell the front page to PR firms that need to promote something in a way that makes people think they came up with the hype themselves.

It's probably the same for a lot of communities with a strong influence on trends, like popular subreddits or Hacker News.

There is no better ad than the one you don't see. There is no better slogan than the one you repeat to your friends as a catchphrase. And there is no better propaganda than the one based on ideas you think you came up with yourself.


>> If you or he still has his notes, would you be able to ZIP them up and upload them to archive.org?

Not sure why; we were the only two people on earth to ever use it. To summarize, he used single letters for verbs and then single letters for register names: a, b, c, d, e, h, l for the 8-bit registers and x, y, z for the 16-bit pairs (b,c), (d,e), (h,l). It turns out you could always figure out some implied stuff. MAY was move A to Y, but since A is 8 bits and Y is 16, this meant Y was an address, whereas MAD would be move A to D as an 8-bit register-to-register copy. This led to all mnemonics being 3 characters or less. There could also be 8- or 16-bit immediate values. The disassembler used a 4-byte decode table: 3 characters and a size (0, 1, or 2) for the immediate data. It was super clean. I'll post it somewhere some day. I still have my Interact computer in its original box waiting to find a nice museum somewhere. And lots of tapes, including a bunch of little-kid-created BASIC programs. Hard copies of all the "Interaction" newsletters from the local users group too.
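A guess at what that decode table boils down to in modern terms; the structure follows his description, the two opcode values are the standard 8080 encodings behind the MAD/MAY examples, and everything else here is illustrative:

    # Sketch of a decode-table-driven disassembler in the spirit described above.
    # 0x57 and 0x12 are the stock 8080 opcodes for "move A to D" and
    # "store A via the D/E pair"; the sample bytes are stand-ins.
    use strict;
    use warnings;

    my %decode = (
        # opcode => [ mnemonic (3 chars or less), immediate size in bytes (0, 1, or 2) ]
        0x57 => [ 'MAD', 0 ],   # move A to D: 8-bit register-to-register copy
        0x12 => [ 'MAY', 0 ],   # move A to Y: Y is 16 bits, so it's used as an address
        # ... one entry per opcode
    );

    my @bytes = map { ord } split //, "\x57\x12";   # stand-in for real machine code
    my $pc = 0;
    while ($pc < @bytes) {
        my ($mnem, $size) = @{ $decode{ $bytes[$pc] } // [ '???', 0 ] };
        my @imm = @bytes[ $pc + 1 .. $pc + $size ];
        printf "%04X  %-3s %s\n", $pc, $mnem, join ' ', map { sprintf '%02X', $_ } @imm;
        $pc += 1 + $size;
    }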


The funniest demonstration that I watched was at the computer museum at the University of Stuttgart (it's just a single room, but it contains a lot of history!). The guide took an old, butchered radio that was reduced to a coil attached to a speaker and put it on top of the front panel of a PDP-8. Then he started a Fortran compiler, which would take several seconds to complete. During that time, the radio made kind of hideous digital beeping noises from the CPU's electromagnetic emissions that got picked up by the coil inside. You could easily learn to distinguish different compiler phases and tell whether the program made progress. The guide explained that this was a common way for operators back in the day to keep track of the jobs they were running while taking care of other tasks: were they still running? Did they get stuck? Did the job complete and is it time for the next one? Some inventive guys figured out that when you wrote certain instruction sequences, the noise would become tonal and the pitch could even be tuned to some extent. That got them to write programs which would compute nonsense, but when you picked up the emissions, you would hear music! The museum guide ran a few of these programs, to our great amusement :).

I've yet to see this mentioned - or demonstrated - anywhere else.


I think a society must first define its values before trying to align to them.

Elimination of pain and suffering as a primary value changes everything. A superintelligent overlord given that objective would simply eliminate all life on earth. If preservation of life were a value, it would sterilize everything and put all existing life in a coma. If the objective were happiness, it would drug us during the coma.

But it would not be so dumb as to choose those things. I don't know what it would choose exactly, but I think it would have something to do with the purpose of life itself in this universe, which can be seen as a fundamental force to fight entropy at the mesoscale between electromagnetism and gravity.


Any FizzBuzz example almost feels too silly to deserve a serious comment. But here goes.

As you noticed, the FizzBuzz pattern has a period of 15. So if you want to go fast, just go directly to a 15x vectorized unroll with unaligned stores.

Assuming 32-byte vectors like with AVX2, you initialize two vector registers v0, v1 to hold the pattern of 15 4-byte values (the 16th value doesn't matter) corresponding to 1 through 15. You also have corresponding delta vector registers d0, d1 whose entries are +15 for the "pass-through" values and 0 for the -1/-2/-3 sentinel values. The loop kernel is just this:

    store_unaligned(p, v0)
    store_unaligned(p + 8, v1)
    v0 += d0
    v1 += d1
    p += 15
And of course you still need the remainder loop.

You could remove the unaligned stores with a monstrous lcm(15, 16) = 240x unroll (16 shifted instances of the above kernel) but you wouldn't be able to keep all the necessary loop state in registers.


Yes, but after buying and reading this paper, you will be able to determine that you have bought it.
