"My supervisor observed that I was anxious to learn new things. He assigned me to the Discoverer project to develop system schematics, interconnection diagrams, and wire harness definition. I was working for the Electrical Systems engineer. His was a very responsible and highly visible position. And like most engineers working in this new field, he was in his early 20’s, just three years older than me. There were senior engineers in management positions, but there were no senior spacecraft engineers. Everyone was learning on the job."
Parallels to Silicon Valley are easy to find. It's a valid career path: if you want to progress fast, find a hot young area where there are no senior experts and work hard to keep up.
Wow, if I'd had a situation like this when I was in defense I might still be there. Instead, I had managers old enough to be my father, no upward path until I "put in my time", and an attitude that wanting to move around within the company was "disloyal" to my current management.
I hope people don't fixate on this and see this as the issue:
> Instead, I had managers old enough to be my father...
This...
> ... an attitude that wanting to move around within the company was "disloyal" to my current management.
... is bad management and bad company culture, and it's the reason you get older managers who are inflexible and just riding out a paycheck. Those are the areas to fixate on, and if you see them, leave. Nobody should have to work under those conditions. Working for an older manager shouldn't be (and rarely is), in and of itself, a red flag.
I think it's slightly more complicated. Having an older manager/supervisor is not a bad thing. Working in a department where you need to have put in 12 years to become a supervisor is a problem.
The old automotive company I work for released a "strongly recommended" schedule of the different responsibilities an engineer should have held, and for how long, to be considered a strong candidate for promotion. There was a similar document I heard about for going from supervisor to manager. Combined, these schedules said that no one would ever work their way up from entry level to department head. Ever. Therefore the only way to get there is outside the official system via office politics, and not being honest about that is pretty dysfunctional on its own. That said, a symptom of that is the number of supervisors and managers who have 30+ years in...
> I think it's slightly more complicated. Having an older manager/supervisor is not a bad thing. Working in a department where you need to have put in 12 years to become a supervisor is a problem.
Exactly. I was too brief with that point, but the issue is first-line managers, in some cases, having more experience than I had years of life at that point. It's an environment that seems to condemn people to life as a minor cog in a huge machine, where your opinions aren't respected and you have no hope of making it into any sort of management responsibility.
Yeah, I realize now I should have committed more words to explaining that part. An older manager per se was not a problem. It was the fact that every line manager had 15+ years of experience, and thus this level of tenure appeared to be a requirement to actually advance. It crushed any feelings of ambition, because there was literally nowhere to go even if the other management problems didn't exist.
I’m sorry to hear that. The defense companies I’ve worked for, big and small, have fought to retain employees. Whether it be the end of a contract or just wanting to change the type of work you are doing, these companies fought to find you another position versus taking your clearance to a competitor.
The Silver Tsunami is not kind to DoD employment and to hiring managers. The young-uns look around, see nothing but old white dudes, and then head to anything resembling a FAANG. Livermore has had a heck of a problem with it. Even the 35+ year olds are trying to make like a banana.
I sort of started my career in finance like this. Never sought or intended to be in finance, but I could identify and solve problems for the firm(s) I worked for that they didn't even know they had. Used to quip, "when I'm doing my job well, no one knows that I exist. The moment I fuck up, everyone knows I exist."
You often can get a lot of money if you threaten to leave. But in my experience, it's a lot more profitable to work in the front office and have people know exactly how much money you have made for the firm.
This is sort of parallel to very young network engineers and NOC staff, in like 1996, discovering for the first time what an ASN is, what BGP4 is, etc. And the commercial growth of the modern Internet.
Areas in cyber security. Ex: Automating simple but highly repetitive tasks for incident responders will win you lots of friends and generate lots more opportunities if you want it.
I think applications of AI to creating 3d modeling assets for videogames. Gaming is a >100 billion dollar industry and art can be as much as half of costs for AAA games. It seems to me like it's the lowest hanging fruit of AI in terms of difficulty vs potential revenue.
MS Flight Simulator 2020 seems to be doing this - creating 3D maps of all structures/trees/etc out of normal Bing world flat maps. Not sure how much human touch is required afterwards, but if those early videos are representative then damn that's amazing and brings realism never seen before
That's one example, another one is for smaller assets - trees and vegetation, for example (SpeedTree). There's definitely a market for creating 3rd party tooling to help with developing a game. There's actually quite a few companies and technologies that specialize in one aspect of a game or game's world.
There's a few other interesting technologies that add a lot of content to a game for relatively small investments; think physics and destruction engines, animation technology (iirc there is something developed by EA's sports games branch that can generate intermediate animations for smooth animation transitions), world generation (height maps), etc.
From where I’m sitting, it seems like most of the money to be made in the next 10 years won’t come from high tech, but rather from applying bang average tech (Python, React, Go, and standard AWS infra) to industries where the most experienced specialists have no concept of what those words mean (e.g. healthcare, agriculture, finance). There are huge opportunities to make these industries more efficient, and make money along the way.
There are so many industries that aren't going to be 'disrupted' by some fancy new tech. Because they have much more urgent low tech problems to solve first.
I've lost count of the number of startups that want to 'disrupt' agriculture with blockchain, because, you know, who wouldn't want to have their potatoes tracked in a decentralized proof-of-stake ledger? PotatoCoin! But the reality is that to store data in a blockchain, you first need data to begin with. Much of the business in agriculture is conducted on a handshake; usually there isn't even paperwork. Also, in many countries agriculture relies on operating in a grey area of tax just to turn a profit. I bet a lot of farmers don't even want a transparent ledger for everyone to see ;-)
Most of us developers don't have the real-life industry experience to remotely understand where the opportunities lie. And we are way too naive in thinking that we can solve any problem with software.
I've been a farmer for most of my life in conjunction with my day job, and this is correct.
There was an article recently in a modern farming magazine about how restaurants and downstream consumer organizations and businesses want to use blockchain technology to track where a cow was raised, where it was processed and how, and when it became the hamburger or whatever you're eating so that customers can scan it and feel better about themselves and their choices.
Cool. Super cool technology. That I will never use on my farm, because it provides me with absolutely zero incentive at the start of the stream other than increased overhead. After looking into the technology, there are ways to use that data to track cattle, and ways to use that data to track yield and feed ratios and that sort of thing, but that's not its purpose, and therefore that data is almost impossible to get out of the system. It also would be super easy to game by the large 'family owned' farms (sort of like how they do it now anyway).
And I already do what I need with excel, it's just time consuming. So it's another example of 'BUT IT'S TECHNOLOGY' that just doesn't appeal to me.
For other examples see: automated weed control, most automated irrigation systems, most automated weather and climate tracking systems, and definitely any/all automated feed and livestock management systems.
The one thing you said that I want to push back on - from a very US-centered perspective - is that very little farming is done via handshake anymore. Any farmer who is actually able to survive does so by playing the markets, by pushing formal contracts, and by being smarter than that. Nothing on the outgoing side is left to chance anymore, because all of our inputs are based on chance already (weather, performance, yield).
Yes, for example in warehouse management. Most companies still use outdated apps on Windows ME.
The problem is that the stakes are so high that nobody dares to change.
You cannot turn off a running warehouse for a few days.
I believe data migration is mostly where the opportunities are, because then you can one day flip the switch and start working with new software while having the same 'state' as the old software.
I work at a company that intersects finance and healthcare.
Our competitors rose to dominance through smart acquisitions, but this came at a fixed linear or super-linear operating cost per unit of growth. We are growing by identifying abstractions and incorporating new markets into existing ones.
Their method was the best way to quickly grow a company to a national scale in the 60s, but it comes at the monotonically-increasing costs of internal coordination and duplicate work. Software allows you to grow gracefully, while avoiding those problems as they appear.
As an example, we analyze inbound calls by volume every quarter to identify how we can anticipate user needs, and solve them ahead of time, without requiring a phone call (thereby improving the client's UX and saving operating cost). Every year, we reduce our call volume while simultaneously increasing our client headcount and improving the tools that are at our team's disposal. That kind of iterative optimization isn't possible at the big conglomerates.
Look for industries that haven't figured this out yet; those are the ones where the money will be made. Stripe is a good example of this playing out in finance.
Thank you for your explanation! I would love to hear more about your approach. I am looking at what to do next in my professional life, and these kinds of concepts sound very exciting!
Would you perhaps be willing to have a short conference call, or perhaps, an email exchange?
I can't speak for this person, but it appears they're making the case that a lot of value comes from giving some computer love to industries that get no love but could benefit from a little TLC.
Damn, so it's basically luck... I mean, even bitcoin was 10 years ago, I think? Also, nowadays you probably need a PhD to work on something really interesting.
Not necessarily luck. Just go to the professional forums of the people in those "cutting-edge to the masses" fields, and see what they're nerding out over as a "cool thing they'd like to get into when it gets a bit more mature."
(It works for programming languages, too. Hang out in a forum for a relatively-new language like, say, Elixir, and you'll hear gossip about a lot of languages whose communities would describe Elixir as a staid legacy language, that'd never merit a mention here.)
I don't see how you could actually come up with new stuff in PLT without a very heavy math background, specifically in logic. I mean it's not impossible just extremely unlikely.
New programming languages aren’t usually examples of novel advances in type theory. They’re usually just novel combinations of existing features. They’re still “bleeding edge” in the sense that they’re what the industry will be using in 20 years.
Makes sense. But computers have a lot of interesting applications, so maybe something will come out of it. For example, gaming in its early days, in the '70s and '80s, still attracted a lot of teenagers.
That's clever. Nobody can draw dark in AR, which means overlays on the world have to be too bright and shiny. Ellsworth came up with a use case where the AR images always have a grey background, the blank game board. So this is much more workable than the more elaborate AR schemes. Cheaper, too, at $299.
Quantum information science seems like something that will take many many years before it amounts to something.
I remember reading that a quantum revolution in computing was just around the corner 12 years ago (in 2007), and only a month ago did any sort of "breakthrough" happen (referring to Google's "quantum supremacy" breakthrough [0]).
Considering this glacial track record, I'm very skeptical.
This is becoming less true. The field is still very PhD heavy, but some employers definitely hire people right out of an undergraduate Physics or ECE degree. They end up effectively completing a PhD program on the job, but don’t get the fancy title. On the bright side, they get paid probably 3-4x more than the average grad student stipend.
I think you’ll start to see more demand for non-PhDs as the various competing QIS players begin to standardize their architectures and practices.
During the Iraq War, everyone marveled at the great new technology behind the Predator drone to conduct surveillance and deliver ordnance during strike operations. Then it leaked that the Predator had been developed 15 years earlier and used successfully in Desert Storm. Cut to 2011 when we learned that the SEAL team that killed Osama Bin Laden used a stealth helicopter, something that no one in the world had any public success toward achieving, and we not only achieved it but were running classified missions with it.
Based on those two examples alone, I estimate that the most secretive programs are 15 years more technologically advanced than we are in the public ___domain.
AI and Quantum Computing come to mind as likely focus areas in addition to those already mentioned.
There is also a great deal of fetishizing of these black programs and post-hoc creation myths that assume a level of advancement that simply does not exist and did not exist at the time. As a simple example, you are confusing the Pioneer drone (developed in the late 80s, deployed in Desert Storm, basically an RC plane with a video camera) with the Predator. The Predator was not ever used in Desert Storm and in fact the program was only started just prior to that conflict. Given that drones have existed since the 30s and the initial base that was used to develop the Predator started as a cruise missile, which have been widely known since the early 80s, I am not quite sure why you think drone development was a surprise to anyone other than the casual, ill-informed public.
Yes. The British Army was actively using drones from 1999 [0]. The unarmed Phoenix system was developed to support artillery forward observers, and used in the Balkan wars and Gulf War 1. It was operated by the artillery regiments, not TLA hoods.
> AI and Quantum Computing come to mind as likely focus areas in addition to those already mentioned.
Um, yeah, no. Pretty much all experimental QC students capable of doing the work are accounted for. The theorists who don't hustle mostly can't get jobs, with good reason: it's mostly malarkey.
You'd think if there were some blackops AI center of genius out there, at least one of them would have a job at Facebook making $2m a year or at a hedge fund gobbling up the order books. There are none.
Neither the Predator nor stealth choppers were particularly surprising. You can watch documentaries on YouTube about stealth choppers like the RAH-66 Comanche, and drones were in wide, boasted-of use in Vietnam.
FWIW I did a little consulting work for SBIR places; most of the procurement officers got their ideas from shitty science fiction. I'm pretty sure we'll eventually get all the dumb things from Aliens no matter how impractical and useless in actual combat they are.
> You'd think if there were some blackops AI center of genius out there, at least one of them would have a job at Facebook making $2m a year or at a hedge fund gobbling up the order books. There are none.
Perhaps they were TOO successful, and their AI escaped, eating the researchers on its way out. There's no way for us to know..
To implement a prototype of the future, this works well, but to sell something to consumers you have to implement something "cheap", which Xerox PARC didn't manage to do..
And yet their innovations paved the way towards HCI paradigms we take for granted today. So although they weren't practical when they were invented, I think there's still value in these sorts of labs continuing to pave the way for the practical creators of 2034.
That's the way I (try to) think, sort of, but honestly the successful projects are the ones with money behind them. Java was just terrible, but IIRC Sun put approx. $0.5 billion into advertising it, and it was jumped all over by execs because of the advertising and not any technical, or far more importantly business, merit.
Tech is fad driven - look at all the big data projects by companies that don't need them.
Java was released in 1995. Compared to the state of the art back then, I believe it was a significant advancement. There was plain C with all its shortcomings, C++, but in a much worse state than today, and then the dynamic bunch (Python, Tcl/Tk?).
There's no doubt Java had major issues, many of which were only ameliorated with Java 2 and later, and some remain today. But on the other hand it was memory safe, statically typed, with much simpler semantics, machine independence, Unicode support etc. pp. I believe Java overall meant progress.
I'm not sure the advertising budget perspective is that salient. The main competitor for Sun at the time was Microsoft (with C++ and their Windows platform), who had a very well working marketing / developer relations / etc operation going as well.
I'm not so sure, you're comparing it only to C,C++ or Tcl/Tk but what about Ada? Smalltalk? Lisp? Objective-C?
Plus Java's standard library was full of bugs for a looonnng time: they focused on new features instead of fixing it..
I've experienced "Java can't print" myself while Java's hype was going full force..
Objective-C was going nowhere before Apple's acquisition; my thesis was rewriting a particle system from Objective-C/NeXT into C++/Windows as a means for the department to keep doing research in OSes that everyone else could use.
Ada compilers were bloody expensive.
Modula-2, Delphi, VB, Oberon, Modula-3,..., were great, but they were either only available commercially, or OS specific.
Java was given away free as in beer, with a batteries-included standard library and with fewer hurdles than trying to write portable C or pre-C++98 across all the platforms that existed in the 90's.
That alone made us try it out.
At my university, the computing department changed all the compiler design and distributed computing classes to Java, without Sun paying anything to a tiny Portuguese university.
The Java performance complaints are also widely overstated... Mostly, when running on Windows, it took a few seconds to start a Java program if the JVM wasn't yet loaded into memory (as it had to be loaded first).
But a well written java server-side application (so something that doesn't need to be started very often but generally just idles until it's contacted by clients) is as fast or faster than anything you were likely to be using for a couple decades after its introduction.
Ada, until Gnat came out, was similar to C++, minus templates and "real" object orientation but plus generics and threading. It was also hand crafted by military contractors out of pure unobtanium. The true interesting bit, Spark, didn't come until later. (I'm not sure who did the Ada I used in a class in the late '80s.)
Smalltalk was neat, but never caught on, possibly due to the lack of free implementations and the difficulty of integrating with anything else. (The original prototype of the OS/2 3.0/Warp UI was done in Digi-something Smalltalk. (Waves feebly from IBM.))
Lisp, in particular Common Lisp, was very popular in certain areas, but again died out. Possibly another combination of lack of free implementations, integration difficulty, and sheer difference from the alternatives. (Scheme. Sigh. The cleanest language design ever. But no single OO design when that hype train was at full speed. Plus parentheses, which, well, Common Lisp had already poisoned.)
Guy Steele said that Java dragged the industry halfway to Lisp, if that means anything.
Java was dire. It could have been much better when released - granted, it was originally designed for embedded use, but it was then arguably mis-sold.
Performance was avoidably bad in some parts of the libraries.
I recall a major rewrite of some product being started then killed because performance sucked (surely much better now, but things took far too long to change).
Memory behaviour was not fixed until Java 5:
"The original Java memory model, developed in 1995, was widely perceived as broken, preventing many runtime optimizations and not providing strong enough guarantees for code safety"
I'd also like to add Brinch Hansen's heartfelt lament
"If programmers no longer see the need for interference control [in java's threading] then I have apparently wasted my most creative years developing rigorous concepts which have now been compromised or abandoned by programmers"
Per Brinch Hansen, "Java's Insecure Parallelism", SIGPLAN Notices, April 1999
> statically typed
with type holes for arrays
> with much simpler semantics
yep, too simple. IMO it relied too much on the programmer. Generics took too long to appear also.
I think it never had Eiffel's level of tooling and libraries; it was focused on UNIX platforms as a CLI compiler, hence why it never took off outside its research university.
I learned about it from a Dr. Dobb's article and that was about it.
As for performance, Eiffel was already quite fast, because while it used a VM for development (MELT), AOT compilation was done via system compilers, with support for generics and value types from the beginning.
It was also one of the first OOP languages to support non-nullable references.
I subcontracted to a guy who was sold on the Eiffel hype. I got it and installed it too. I managed to crash the compiler in the first few minutes of use. The other guy ended up entirely distrusting the incremental compilation after losing a day or so to that.
I don't remember any particular tooling, and the language was crap (Meyer was better at hype than language/compiler design).
IIRC Sather was Eiffel with the subtyping holes fixed. Bertrand Meyer allowed holes in Eiffel's subtyping (which Sather disallowed) that he was sure he could deal with, but eventually they proved intractable.
My boss eventually abandoned Eiffel, and let the Eiffel company know why. It was not an enjoyable experience for either him or me.
They’re probably advanced in achieving some strange specific technological goal like making rotor blades sound like goats. But they probably do it with floppy drives.
Multifrequency, spread-spectrum, unjammable, secure communications that sound like an ancient POTS line over a sub-oceanic cable and have the connection resilience of two cans and a piece of string.
15 years ahead is a good rule of thumb for DoD / Aerospace. 3D printing and additive manufacturing used to carry monikers like Rapid Prototyping and Stereolithography but have been in use for decades. Another metric is cost, which is typically many orders of magnitude ahead of the inevitable consumer tech version.
I love reading about space history. And the part of the article that hit home was the floor with 200+ engineers working as a team. That engineering led culture is giving way to financial dominance.
The Long-Forgotten Flight That Sent Boeing Off Course:
A company once driven by engineers became driven by finance
I was struck, though, by this quote: "my Electrical System Engineer Al Gross threw me a curve. He said, 'Bob, we need to connect two new circuits with a couple of diodes. I don't know where you are going to put them, but I don't want to add them to the Recovery Programmer as this will trigger a requal program.' That would require a lot of work and paperwork to requalify the programmer."
I don't know enough about aerospace to really follow this (or even to know what a "programmer" is in this context), but it certainly sounds strikingly similar to Boeing's use of ad-hoc engineering solutions to avoid expensive retraining of pilots.
Perhaps technical corporate culture has always been a complicated relationship between finance and engineering. Triumphantly successful projects tend to make everyone forget about the tension, while miserable failures lead to more introspection.
Both bombing drones and stealth helicopters seem to be relatively straightforward concepts that were not publicly developed simply because outside of the military there is no motivation to pursue those things.
On the other hand AI and quantum computing are generally useful and require novel breakthroughs, so I would expect a smaller gap or no gap.
For my part, as no one special in particular, I've seen information and intelligence tools available to line unit leadership which were well ahead of their time. As well, I have seen technology at work in the field during missions which I did not recognize at the time (and at least one thing I still cannot explain but which certainly must have been some kind of weapon or recon platform). I also have family that works tangentially with skunkworks engineers who is quick to say exactly this, that the bleeding edge the defense contractors are working on is easily a decade or two ahead of the curve.
Sure. To sort of preface this by ruining my credibility, I was once previously witness to an "actual" UFO, before I was in the military (in the company of several others including a police officer, however still unexplained). Anyway, the thing I was talking about was much less "that kind" of UFO than something with components I may be able to explain but which I did not then and still do not today completely recognize.

What I and others in my (infantry) unit saw were vehicles with a kind of very low frequency pulsejet propulsion system in the night sky above a certain large Iraqi city, flying in from the south (ish) and making a big U-turn before flying from whence they came (ish). You'll have to trust me when I tell you these were yellowish flashes from some kind of fuel detonations and not flight lights, given the flashes would appear bright, expand quickly, and then fade out over 1-3 seconds while maintaining a static or near-static position in the sky. However, unlike any pulsejet system I've ever heard of, these flashes would occur only about once every 5-10 seconds, albeit at very regular intervals. They were also completely silent. There were no other flight lights of any kind, and they only ever flew at night.

Through night vision, we could very briefly see the fuselage streaking forward after each explosion, but it was otherwise not visible. I suspect they were painted black, as were many other kinds of aircraft we were accustomed to seeing (including of course some otherwise completely normal-looking civilian-ish aircraft whose operators would rather everyone simply forgot existed). The objects through thermal optics were almost exactly the same, with the explosions visible (as hot), and what must have been the fuselage briefly visible streaking away from the explosion (also hot, though nowhere near as hot as the explosions).

I never saw these things drop or fire munitions. But neither were they ever seen to have been fired at, which I mention only because that's what we first thought it may have been, however incredibly unlikely, until we noted the interval regularity and these sightings became a not-uncommon occurrence in the night sky. However, I did once witness some "normal" jet aircraft (flashing white flight lights) vaguely shadow one of these objects through its U-turn and continue on at a slower speed after it. Just to be clear: I can't be certain the flashes were actually propelling the fuselage forward, so I don't mean to imply the "streaking" was any kind of discernible acceleration, but rather just that it was the only other visible component of this thing at any time.
> when we learned that the SEAL team that killed Osama Bin Laden used a stealth helicopter, something that no one in the world had any public success toward achieving
It’s more interesting that it leaked out that they crashed one of said helicopters because they ignored the mission planning advice they’d been given. How embarrassing!
> ... because they ignored the mission planning advice they’d been given.
Who is "they"? (The pilots?) Which advice was ignored?
> The crash of the Blackhawk may have been, at least in part, caused by the aerodynamic deficiencies introduced to the airframe by the stealth technology add-ons.[321]
The most interesting thing for me is: how do you keep all those bright minds with enough knowledge and experience from striking out on their own either in academia or in business? At the end of the day creating new tech is hard, replicating 'easy'. Or is it that the "15 yr ahead" simultaneously suggests it's about 15 yrs of work on your own to replicate? Perhaps they work with great compartmentalization? But compartmentalization suggests you don't need great minds, but just very good minds with good structure. That goes against what I think I know about innovation. It's quite fun fantasizing about a world within a world with better technology.
You are ignoring a very important part of it - some people genuinely want to "help their country" and are willing to sacrifice personal success for this. For example, it's not exactly a secret that 3-letter agencies don't pay software engineers anywhere near as well as other IT companies would, but the promise of doing something "for your country" is worth more than money (just like game developers sacrifice money to work on something they really like).
Since graduating from college with an EE degree, I've worked for 3 employers, all of which had or have very deep roots in the intel community. At my current firm, we've always had success through a large investment in new technology under IRAD. As one of our founders has said, we provide solutions to problems that our customers don't yet know they have. Pay scales for engineers are consistent with commercial companies here in Silicon Valley. In my 40+ year career, I can honestly say that I've never sacrificed personal success for this job, and the work that I've done has contributed, in a hugely important way, to our ability to collect and analyze data from the "bad guys". I have absolutely no regrets for my career path.
> For example, it's not exactly a secret that 3-letter agencies don't pay software engineers anywhere near as well as other IT companies would, but the promise of doing something "for your country" is worth more than money (just like game developers sacrifice money to work on something they really like).
You hit the nail on the head in explaining this dynamic, but I hate the fact it exists in the first place, in game dev or IC. It also exists with US special operations, as another way to sort out anyone who isn't a true believer.
Fair, you might say. I just think it's dumb that a Google/Netflix/Facebook engineer working on some boring cog in their company's infrastructure gets paid a top-of-market $250k+, while an engine programmer working on an AAA title makes half that, or a firmware engineer working on offensive cyber payloads makes half that, or a special operations soldier makes half that.
Maybe it's just market demand: boring stuff is paid better since everyone wants to do something exciting. But I think it's also a filter, and a lot of these places know that.
On this subject, there is a very interesting new book about Jim Simons, the founder of Renaissance Technologies, who used to work as a codebreaker in the US. I think for many of the people in these programs, the attraction is getting to solve problems beyond anyone else's reach, with tools beyond anyone else's reach.
> getting to solve problems beyond anyone else's reach
more this, and less "tools beyond anyone else's reach".
A fair number of people go to work with these groups to "check the box" that they've done it. They know who's at NSA, or CIA, or DARPA. Because they spent 2-5 years with them inside the box. There's a tremendous amount of self-confidence that comes out of a successful turn at one of these places. You know you know.
For the same reason people do research, that also "pays" a pittance compared to what those people could make in the market.
And actually, it at least pays better than academia. Most of the time you will move to some smaller, cheaper town, sometimes even living on a base.
You can actually buy a house and have a family on defense wages. Try doing the same while doing research in academia.
knowing how to do some highly advanced tech thing and building a successful sustainable business are orthogonal skills. It is very common for somebody in one camp to overlook this.
Definitely with quantum there seems to be a huge advantage to having it while convincing everyone that you don't. I think we can move towards quantum-proof crypto, but it won't happen until we are convinced someone can spy on us if we don't.
I thought it was pretty widely accepted (if not outright known) that military applications, particularly hardware surrounding stealth or the ability to kill people, was about a decade ahead of anything comparable in the private sector. It's not surprising to imagine that gap widening over time as well.
Yes, generally there are people in the government using technology well more advanced than anything available to the general public. However, it's only a handful of people that are using it. Much of the high cost comes from these systems being built at such a small scale.
Also earlier this year when Trump tweeted a classified satellite picture of some Iranian factory and everyone marvelled at the details and quality...
This is not surprising, though, considering that the US is the most technologically and scientifically advanced country and that it has been outspending everyone else by orders of magnitude on military (and related domains) research for about 80 years.
In fact it crashed because it was hovering over the (walled) compound, and those walls were modifying the airflow and causing a vortex ring state. Most helicopters would have suffered the same fate.
Considering that some individual NRO launches have had budgetary costs of $900m to $1.1bn for a single satellite and launch vehicle, it would be fascinating to know what kind of technology they're dealing with today. Lots of rumors about KH-11s and Misty, not much concrete info, for obvious reasons (it's all TS and codeword and higher).
This must be one of the purest apples to pears comparison I've seen: comparing two similar looking things intended to do completely different things and declaring one superior to the other without any hesitation, surely this is the very ur-form of the expression?
What capabilities would those (supposed) satellites have? Reading the small handwriting on the shopping list I made, through the apartment window, at night?
Keep in mind the Airy disk limit 1.22λ/d. Unless you're doing optical interferometry between satellites, 100mm resolution from 100km altitude, 1 microradian, requires an aperture about 800'000 wavelengths across. For blue light, that's a 300-mm-wide telescope; doable, although getting the optics to be that good is a serious challenge, motion blur is another issue, and such small sensor pixels don't get many photons and can be noisy. Now if you want to get to 10mm resolution, you need a 3-meter aperture; 1mm, like you're suggesting, requires 30 meters. These are much more challenging to launch.
There are some interesting things you can do with indirect light, but so far they don't help with this.
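For anyone who wants to plug in their own numbers, here is a minimal sketch of that diffraction-limit arithmetic (the wavelength and altitude are my own assumed values; with the 1.22 prefactor the apertures land within roughly a factor of two of the round figures above, same order of magnitude):

    # Back-of-envelope version of the diffraction-limit argument above.
    # Assumed values (mine, not the parent's): blue light at 450 nm, 100 km altitude.
    # Rayleigh criterion: theta ~ 1.22 * lambda / D  =>  D ~ 1.22 * lambda * h / r

    wavelength_m = 450e-9      # blue light
    altitude_m   = 100e3       # 100 km

    for ground_res_m in (0.1, 0.01, 0.001):            # 100 mm, 10 mm, 1 mm
        theta_rad  = ground_res_m / altitude_m          # required angular resolution
        aperture_m = 1.22 * wavelength_m / theta_rad    # minimum aperture diameter
        print(f"{ground_res_m * 1000:6.1f} mm from 100 km -> "
              f"{theta_rad * 1e6:5.2f} urad -> aperture ~ {aperture_m:5.1f} m")

With those assumptions the three cases come out around 0.5 m, 5 m, and 55 m, which tells the same story as the 300 mm / 3 m / 30 m figures above: reading a shopping list from orbit is not an optics problem you solve with a bigger zoom lens.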
I saw something once that explained how the powerful magnification seems silly until you account for the angles that things are at.
The example they used was that license plates are vertical, so to read them the satellite can't be directly above the car; it's actually at a much greater distance, and greater magnification is necessary because of the atmosphere and the distortion of reading something at an angle.
More that the satellites are in fixed orbits that are relatively low (250 km) and moving at thousands of km per second. Even if the target is directly below them, they need to rotate quite a lot to see it for more than a few seconds at a time. If the target isn't directly below, they need to look much farther sideways than down.
Slight typo there: "thousands of km per second", you probably meant "thousands of meters per second". Low orbit is on the order of 7-8 km/s. I did find the typo humorous.
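To put rough numbers on the geometry described a few comments up, here is a quick flat-earth sketch; the altitude, speed, and angles are my own illustrative values, not anything from the thread:

    import math

    # Quick flat-earth sketch of the viewing geometry discussed above.
    # Illustrative values only: ~250 km altitude, ~7.75 km/s orbital speed.
    altitude_km = 250.0
    speed_km_s  = 7.75

    # Slant range grows quickly once the target is not directly below.
    for off_nadir_deg in (0, 30, 60, 75):
        slant_km = altitude_km / math.cos(math.radians(off_nadir_deg))
        print(f"{off_nadir_deg:2d} deg off nadir -> slant range ~ {slant_km:5.0f} km")

    # Angular rate needed just to keep pointing at a target passing directly
    # below (small-angle approximation: rate ~ v / h).
    rate_deg_s = math.degrees(speed_km_s / altitude_km)
    print(f"tracking rate at nadir ~ {rate_deg_s:.1f} deg/s")

At 75 degrees off nadir the viewing path is already almost four times the straight-down distance, and even the straight-down case requires slewing on the order of a couple of degrees per second to hold a target in view.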
The picture Trump tweeted was from a satellite with a resolution of at least 4 inches. They can definitely see your cell phone, but they can't read it- at least not from space.
That was one of the more interesting black programs. Some of them are really boring.
There are times when you get through all the security checks and see the thing, and think, "they still have one of those running?"
The famous Blue Cube in Sunnyvale, the USAF's satellite control center, was running 1960s technology well into the late 1980s.
> The famous Blue Cube in Sunnyvale, the USAF's satellite control center
I just got off westbound 237 at Mathilda an hour ago for the first time in years. It's so strange to see that entire side of the road leveled in the background.
Blue Cube: gone. Lockheed building 101: gone. Lockheed building 102: gone. Five years of my career without a single physical artifact left.
> "One of my early assignments was to lay out a printed circuit board. My electrical engineer gave me a schematic and identified each part type, be it a TO5 transistor, carbon resistor, tantalum foil capacitor, et cetera. My task was to fit all the components on a fixed size board and connect them according to the schematic with copper circuit runs on the back of the board."
This sounds like it would be an incredibly satisfying job, especially given that there was constant variation in what he had to build
Autorouting is still not generally used very much in PCB design. I suspect it could be used more than it currently is, but generally you spend a lot more time specifying design rules and the results are often quite poor.
>>> But what about those people in the monkey lab? They believed this spacecraft was for their payloads. And they will never find out they are just a cover for the CORONA program.
That as much as anything here seemed ... wasteful? cruel? Unlikely to fool anyone?
I got in a conversation with my brother in law's uncle a few years back, and found out he spent a lot of time "back when" working out how to de-orbit the old Skylab station. Calculating drag based on how it was oriented (and how to orient it) to encourage de-orbiting just so. Still kind of amazing to think about how he did that. It's like trying to line up just the right bowling ball shot on a 1000' long lane. Remotely.
> He told me that what I was about to learn, I could not discuss with anyone who was not also cleared for this information. I could not talk about this with family, friends, or other GE employees forever.
Yes. If you haven't been debriefed on the topic, you haven't been told which parts of what you know are still public. I know a few ex-top-secret people who will admit they worked on projects that we now know a lot of information about - but they refuse to say more than that they worked on the project. They haven't been debriefed, so they won't even read a newspaper out loud if it touches their old project, because they legally cannot talk about it, even public information, lest they leak something that hasn't been made public yet.
2019, seen from the late 1950s, was approximately "forever". Remember, 2001: A Space Odyssey, filmed 10 years later, was taking place in 2001, already 18 years ago. We are that far in the future.
That has nothing to do with your comment, but regardless it absolutely doesn't. Just because other people may be allowed to talk about it doesn't mean that you are.
> I assume he has requested and been given permission to talk freely about it.
I don't assume he had to, and it's good that people who are that old aren't afraid to talk about what they did back then.
It's 50 years ago. To compare: General Groves, the military leader of the whole Manhattan Project, published a book with a comparable amount of detail not even 20 years after the project, in 1962: "Now It Can Be Told".
I love stories like this and have read several books around these topics, but I always love to ask for recommendations when something like this gets posted on HN. What is your favorite book on this topic?
"After my clearance was granted, I moved into the main drafting room with about sixty other draftsmen, designers, and lofts men. (A loftsman develops full scale patterns, or drawings, of complex shapes.)"
What does "develop full scale patterns, or drawings, of complex shapes" mean in the context that this is different from drawing blueprints like the draftsmen?
I believe loftsmen prepare full size drawings or patterns at 1 to 1 scale, whereas a blueprint may be at a reduced scale. I.e. a blueprint fits on a drafting table; the pattern created by a loftsman would go on a factory floor.
Now it is common to use a laser tracker to take measurements and create full-size drawings. SpatialAnalyzer really changed the game. NNSY has over 90 API laser trackers that they use.
The patterns would be for things like copy milling or cutting out complex pieces of sheet metal. Blueprints are schematic representations of parts/assemblies.
If you liked this, here's a talk by an engineer who worked on the camera systems of the next-generation black program after CORONA, HEXAGON: https://youtu.be/GtmtYlcPYYA