Will terrorists ever get their hands on nuclear weapons?

This is something that I have been thinking about lately. Do you think it's possible that terrorist organizations will eventually get their hands on nuclear weapons and use them to commit attacks against the West? If they do, the destruction that follows will make 9/11 pale in comparison. :/

Yeah 14
No 5
Maybe 8
Comments ( 19 )
  • Boojum

    It's not impossible. According to Wikipedia, there are more than 15,000 nuclear devices in the world (most of which are at least partially dismantled), and some of them will be in places where the local political situation isn't that stable and security systems could be better and less corrupt. Still, detonating a nuclear warhead isn't like in the movies; there's not a big red button with a countdown display next to it. Because they tend to make a bit of a mess when they go off, they're built with a lot of internal safeguards which mean that detonation will only occur if the device gets exactly the right inputs in exactly the right sequence.

    I suspect a dirty bomb - a conventional bomb with radioactive material in it which will be widely dispersed when it goes off - is far more likely. One of those would cause huge disruption because of the complex clean-up that would be required.

    I'm much more concerned about the potential for terrorist groups and rogue states to develop and use biological weapons. That's not a huge problem right now, but there have been huge strides forward in manipulating DNA over recent decades, and every technological innovation tends to become cheaper and more widely used over time.

    It's not beyond the bounds of possibility that there could be a day when a few fruitcakes of some particular religious or political flavour cook up something in their home yoghurt-maker that has an incubation time that's much, much longer than Covid, is far more communicable during that phase, and far more deadly once the disease progresses to the acute phase.

    • MissileExpert

      The virus you're talking about is already out there. I can neither confirm nor deny any additional speculation.

    • S0UNDS_WEIRD

      Came here to say almost exactly this.

  • Tommythecaty

    They already have, it’s just maybe they work with.....oh never mind.

  • a-curious-bunny

    Probably shitty low-yield ones, yeah. Didn't some teenager build a nuclear something for a science project a few years back?

    • Boojum

      https://en.wikipedia.org/wiki/David_Hahn

      • a-curious-bunny

        That's pretty cool. Shame about the drugs; he could've gone places.

  • S0UNDS_WEIRD

    I came to say almost exactly what Boojum already did.

    Unlike nuclear weapons, improvements to tools like CRISPR/Cas9 (and whatever comes after them) are eventually going to make it frighteningly easy for many people to artificially recreate heaps of deadly viruses whose genetic makeup is public knowledge.

    Are you familiar with the Fermi paradox? It's the paradox that, by some estimates, there should be at least 100,000 civilizations in our galaxy alone, yet we've found evidence of exactly zero. It seems to imply civilizations always die before becoming spacefaring to an interstellar extent. Every single time.
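
    (Rough back-of-envelope in Python, just to show where a figure like 100,000 comes from. Every parameter value below is a guess pulled from the optimistic end of the commonly quoted ranges, not a settled number; plug in pessimistic guesses and N drops to essentially zero.)

    ```python
    # Drake equation: N = R* * fp * ne * fl * fi * fc * L
    R_star = 2.0         # new stars formed per year in the Milky Way
    f_p    = 1.0         # fraction of stars with planets
    n_e    = 1.0         # habitable planets per star that has planets
    f_l    = 0.5         # fraction of habitable planets where life appears
    f_i    = 0.1         # fraction of those where intelligence evolves
    f_c    = 0.1         # fraction of those that emit detectable signals
    L      = 10_000_000  # years such a civilization keeps signalling

    N = R_star * f_p * n_e * f_l * f_i * f_c * L
    print(f"Estimated communicating civilizations in the galaxy: {N:,.0f}")  # 100,000
    ```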

    One proposed solution to the paradox is that this is the inevitable conclusion of technology: The point at which any given member of a species can wipe the entire species out.

    • MissileExpert

      Or the point at which breaking your civilization's camouflage to the rest of the galaxy invites hidden predatory civilizations to gobble you up.

      • dude_Jones

        Stephen Hawking issued stern warnings about alien contact. Here's a link.

        https://www.space.com/34184-stephen-hawking-afraid-alien-civilizations.html

      • S0UNDS_WEIRD

        This is indeed another possibility and one that excludes the first or first few civilizations from the so-called "great filter" because they _are_ the great filter, nipping future competition in the bud.

        It's worth noting that even the scientists who warn of this possible solution generally believe it's extremely, extremely unlikely but simply don't want to take even the smallest chance.

        It's unlikely because any civilization sufficiently advanced to exterminate another one so easily could just as easily ward it off or simply prevent it from ever becoming equally powerful, much as the US keeps various nations from acquiring nukes.

        The main reason it's unlikely, however, is that there's simply nothing to fight about. The universe has more resources than any civilization could ever need and it's also incredibly likely that sufficiently advanced civilizations can simply convert energy to matter just as matter can be converted to energy. The mathematics works both ways.
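
        (Just to spell out the "both ways" point, here's the bare arithmetic of E = mc² run in each direction; nothing here says anything about how a civilization would actually perform the conversion, and the numbers are purely illustrative.)

        ```python
        # Mass-energy equivalence runs in both directions: E = m * c**2 and m = E / c**2.
        c = 299_792_458  # speed of light in m/s

        def energy_from_mass(m_kg):
            """Joules released if m_kg of matter were converted entirely to energy."""
            return m_kg * c**2

        def mass_from_energy(e_joules):
            """Kilograms of matter you could in principle make from e_joules of energy."""
            return e_joules / c**2

        e = energy_from_mass(1.0)  # ~9.0e16 J, roughly 21 megatons of TNT
        print(f"1 kg of matter   -> {e:.2e} J")
        print(f"That much energy -> {mass_from_energy(e):.2f} kg of matter")
        ```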

        I feel the most likely answer to the paradox is that civilizations very quickly reach something known as the "technological singularity" and that after that point they're completely unrecognizable to us.

        Consider this. Humans went thousands of years with little progress, then flew the first crude wooden-and-fabric plane in 1903, and much of the public treated the reports as a hoax until the evidence was overwhelming. It was that crazy. Yet only 66 years later we were walking on the moon. Things are already moving insanely fast and the curve looks increasingly exponential. The thinking is that as soon as the first AI smarter than a human being is created, it will build its own successor even more quickly, because it's smarter. That successor will do so faster still. Soon enough they're producing new generations within hours, then minutes, then seconds.
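
        (A toy model of that compounding speed-up, with completely made-up numbers: one year to build the first successor, and each generation working twice as fast as the last. The point is just the shape of the curve, not a prediction.)

        ```python
        # Each generation builds its successor some fixed factor faster than it was built.
        first_gen_time = 365 * 24 * 3600.0  # seconds to build the first successor: one year (made up)
        speedup = 2.0                       # each generation works twice as fast (made up)

        t, elapsed = first_gen_time, 0.0
        for gen in range(1, 31):
            elapsed += t
            print(f"gen {gen:2d}: built in {t:14.6f} s, total elapsed {elapsed / 86400:7.2f} days")
            t /= speedup

        # The build times form a geometric series, so the total time for infinitely many
        # generations converges to first_gen_time * speedup / (speedup - 1) -- two years here --
        # even though generation 30 is already being built in well under a tenth of a second.
        ```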

        The point is that, once we create that first AI, it's exceedingly likely that within 10 years or so we make what we would traditionally have thought of as billions, trillions, quadrillions of years' worth of progress. It's exceedingly likely that nature doesn't produce much of a middle ground between roughly where we are and inconceivable god-like intelligences, because of this inevitable singularity.

        Therefore it's likely they're out there but we're no more capable of understanding their presence than a microscopic organism is of imagining humans.

        Much less likely, but certainly possible, is that while the universe will eventually be a clusterfuck of civilizations, we actually were the first. It's a huge statistical unlikelihood, but technically _someone_ has to be. We could even end up as the eventual predator, deciding whether to share the universe or snuff out every newcomer as soon as they inevitably start emitting radio signals.

  • sillygirl77

    Not worth your precious peace unless it happens

  • RoseIsabella

    No bueno.

  • LloydAsher

    Any group that detonates a nuclear bomb in a terrorist attack will become the most hunted human beings on planet Earth. No government would protect them. Why? Because nuclear war would commence if one did.

    • 1WeirdGuy

      We are at war with suicide bombers. I don't think they would care about dying. They think they're gonna get a bunch of tight puss when they die.

      • LloydAsher

        Think bigger. Those who employ the suicide bombers are more of a threat than the suicide bombers themselves.

        Suicide bombings are resource-efficient: one guy for numerous casualties. When you don't have access to drones, that's the best you can do.

        • 1WeirdGuy

          Israel, Iran, and Saudi Arabia are behind most of it. They're funding those shitheads.

  • JellyBeanBandit

    Maybe if they ever become easier and cheaper to produce.

  • Somenormie

    Highly doubtful.
