In that recent post, I cited the industry's road map: from nanomaterials to nanocomponents to functional nanosystems, all the way to scaled arrays of atomically precise, productive nanosystems. The trope possibility comes from the word you did not see: replicator.
Replicator, of course, is the SFnal term for things that copy themselves -- for plot purposes, generally without limit. They come from an idea in K. Eric Drexler's Engines of Creation, which popularized the notion of nanotech. Drexler discussed nanomachines called assemblers that would (duh!) assemble other machines, including -- and here is where the madness begins -- machines like themselves.
Assemblers run amok might turn us all into ... more assemblers. This is the so-called "grey goo" scenario. In Greg Bear's Blood Music, the nanites replicate to take over their first human host, then darn near everyone, then pretty much the world.
Of course the notion of run-amok replication isn't limited to nanoscale. There are also (in fiction) clanking replicators -- like the self-replicating devices that cause endless problems in the Stargate TV franchise.
So why are replicators a trope? Because the self-replicating assembler is a biological analogy, a proof of concept -- not anyone's proposed design. Not even Drexler's.
Consider: people build robots to (help) build cars. Our robots do not forage the countryside hunting for steel, plastic, and electric outlets, sometimes building more of themselves and sometimes building cars. Maybe we could make such robots. It seems waaaay too hard. Unnecessarily hard. Pointlessly hard.
But what if? The nanotech community has drawn up ethics guidelines, much as gengineers did a few decades ago (when people worried about superbugs running amok). Here's a snippet from my science article in Analog last year:
Among other topics, the guidelines address alternatives to autonomous, self-replicating assemblers. Should autonomous, self-replicating assemblers be developed anyway, the guidelines argue for layered security measures to counter accidental or purposeful release. A few examples of safeguards:
- Replicators can be designed to depend -- say, for their fuel -- on chemicals unavailable in natural environments. This is like designing a genetically modified bacterium to starve outside the lab for lack of a rare trace element.
- Material-scavenging functions can be limited to specific materials not found in nature, or not found in living cells.
- Onboard programming can defend against software errors -- analogous to mutations in a bacterium -- such as with error-correcting codes.

But could someone intentionally design replicators to survive in the wild? Again, sure -- given lots of money, because replicators are a tough problem. If it's any consolation, there are probably much easier ways to do us harm.
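The error-correcting idea in that last safeguard can be sketched in a few lines. This is a toy illustration only -- it assumes a replicator stores its control program as bits, and uses triple modular redundancy with majority voting, the simplest error-correcting scheme; nothing here comes from any actual nanotech design:

```python
def encode(program_bits):
    """Store each bit of the control program in triplicate."""
    return [b for bit in program_bits for b in (bit, bit, bit)]

def decode(stored_bits):
    """Majority-vote each triple; a single flipped copy is out-voted."""
    corrected = []
    for i in range(0, len(stored_bits), 3):
        triple = stored_bits[i:i + 3]
        corrected.append(1 if sum(triple) >= 2 else 0)
    return corrected

program = [1, 0, 1, 1]
stored = encode(program)

stored[4] ^= 1  # a stray "mutation" flips one stored copy

assert decode(stored) == program  # the flip is corrected on readout
```

Real error-correcting codes (Hamming, Reed-Solomon) are far more efficient, but the principle is the same: a corrupted copy of the program gets repaired before it can propagate to the next generation of replicators.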
Even replicating nanobots designed with malicious intent may not prove fatal. We manage to share Earth with a lot of ever-mutating, mindlessly replicating bacteria.

So nanotech? Real stuff. Nanotech in hard SF? Perfectly reasonable. But grey-goo end-of-the-world scenarios? Replicators disassembling us all? Not going to happen, my friend. Trope.