Humanity will inevitably reach a state of utopia within 200 years

I see so many people nowadays complaining that now is one of the worst periods in humanity's history, that we are already doomed and will go extinct within 200 years, that the future is grim, and so on. But I'm not buying any of it. In fact, I think we live in, by a gigantic margin, the best era of humanity's history, and that we are actually this close to achieving practical utopia.

It's not hard to see why I think this way: for the first time in history, those of us living in developed countries (basically everyone on this site) don't have to worry about hunger (on the contrary, we can eat whatever we desire, whenever we desire, on demand via delivery apps), dying from common diseases, being tortured, being oppressed, and so on, and we also don't have to perform tedious work for our entire lives just to make ends meet. That's not even to mention that we have our entire lives to practise our hobbies, plus amazing technologies that facilitate our lives even further and enrich our entertainment a hundredfold.

In addition, it is no big secret that progress has been speeding up exponentially since the dawn of mankind (and if it is to you, check out one of Kurzweil's graphs), and AI and reality simulation are among the fastest-growing fields there are. AI is already capable of basic creativity (e.g. Jukedeck) and generality (e.g. AlphaZero), so it's only a matter of time before all of our jobs become automated; extrapolating from historical data, this is likely to happen in around 100 years. As for reality simulation, parts of the human brain have already been fully recreated digitally and simulated, and physics, graphics, etc. are close to being indistinguishable from reality even now. Of course, digitally recreating an entire world, with billions of living people, will be hard, but going by these same extrapolations, we will have enough computing power to do so within the next 150 years or so.

Now, once we are able to operate in such a virtual world - which, along with the real world, would be fully maintained by AI - we will be able to do anything that we want, and there would be physically nothing that we could want that we would not get. I think that's pretty much the definition of utopia.

The only hindrance to such a vision is the potential for world-ending disasters; however, no disaster we are aware of puts us at any real risk of extinction. Popular candidates are global warming, nuclear war, and an AI uprising, but none of these has any chance of wiping humanity out completely. Global warming will at most slow down the inevitable; there is physically no way it can wipe out all of humanity. Nuclear war is similar: even in the most devastating scenarios it is always possible to hide in shelters, so absolute extinction is near-impossible. And an AI uprising is very unlikely: if an AI is programmed with not hurting humans as a top priority, and if the code is properly checked to be running correctly, the AI will have no reason to "uprise", because killing humans would hurt it - in the same way that we feel pain when doing things that jeopardise our lives (because evolution "programmed" us to feel pain).

Anyway, this seems to be an unpopular opinion in this day and age, so what do you guys think about the fate of humanity, and what are your reasons for it? Also, if you don't agree with me, where do you think my logic falters?

Poll results:
Humanity won't go extinct and won't reach utopia in the next 200-1000 years - 9 votes
Humanity will reach utopia in the foreseeable future (<200 years) - 1 vote
Humanity will go extinct in 200-1000 years - 3 votes
Humanity will go extinct in the foreseeable future (<200 years) - 1 vote
Humanity will reach utopia in 200-1000 years - 2 votes
Comments ( 53 )
  • olderdude-xx

    If you read old and ancient writings you will find that there have always been people claiming that mankind would reach utopia within a few hundred years. So far, they have been wrong for at least 3000 years.

    I also don't believe that you have even considered the societal problems that are independent of technology.

    Science and technology can only solve technical problems. They cannot solve problems rooted in human nature... Eliminating jobs and greatly reducing starvation will not solve the human problems that have existed for thousands of years...

    I doubt that humans will go extinct for many thousands of years (unless some alien species wipes us out). But I find it probable that the world population in 100+ years might be only a billion or so - or it might just as well be 10-20 billion.

    • SingleUse

      Your first point is more of a sleight-of-hand trick than a sound argument. While technically somewhat - though not fully - true, it's pretty clear that most of the utopias described historically were either ideological in nature, and hence controversial (for example, I would bet most people would prefer today's society to Plato's "utopian" society), or religious, and hence not grounded in observation or logic. All of this is very different from my proposition, which is based exclusively on observation. In other words, there wasn't a single writing throughout history that wagered utopia was a few hundred years away based on verifiable data about the rate of progress at the time. Additionally, many "utopias" predicted by our ancestors have already been fulfilled or bettered, such as those depicted in early sci-fi novels and those relating to human rights and equality.

      I have considered the societal problems, and most of them will have a solution as well. Do you have any specific problems in mind?

      Also, why do you think world population will be that low? It's becoming increasingly likely that a cure for ageing will be developed in this century, and even if it isn't, consciousness simulation will essentially kill ageing anyway. What's your reasoning here?

      • olderdude-xx

        1st. I don't think your so-called "objective data" is all that objective, or even close to complete. I used to think many of the same thoughts you espouse about 40 years ago. Life has taught me how ignorant I was.

        As far as your thinking that the societal problems have real solutions... Many millions (if not billions) of people have thought so ever since technology reached the point of greatly improving life. Technological solutions do not work for societal issues. Societal issues stem from the fact that people are innately free to have different dreams, opinions, and thoughts, along with the fact that some behaviours are simply "human" in nature. That's not something you can actually eliminate.

        As for future world population: major disease (Covid-19 is extremely minor; MERS killed about a third of the people who got it), technological failures, climate change, and many other things could dramatically alter the world population, either quickly or over time. Up and down are equally possible.

        • SingleUse

          Well, just saying "your data isn't objective" isn't much of an argument. I have real data from which falsifiable hypotheses can be made (and have been made in the past - almost all of them were correct). The data I am referring to are, for example, computing power (calculations per second per modern-day dollar) over time and key historical events over time. Both of these have followed an exponential trajectory since long before Moore's Law was formulated (or was in effect). Feel free to explain what you feel is not objective about these data.
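          The exponential claim is easy to sanity-check with a log-linear fit: if calculations per second per dollar really grow exponentially, the data points fall on a straight line once you take logarithms. A minimal sketch in Python, using made-up illustrative numbers (NOT the actual historical dataset):

```python
import math

# Hypothetical (year, calculations-per-second-per-dollar) pairs.
# These values are illustrative stand-ins, not real measurements.
data = [
    (1970, 1e2),
    (1980, 1e4),
    (1990, 1e6),
    (2000, 1e8),
    (2010, 1e10),
    (2020, 1e12),
]

# Ordinary least-squares fit of log10(performance) against year.
xs = [year for year, _ in data]
ys = [math.log10(perf) for _, perf in data]
n = len(data)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)

# A straight line in log space means exponential growth; the slope
# gives the growth rate, from which the doubling time follows.
doubling_time = math.log10(2) / slope  # years per doubling
print(f"doubling time: {doubling_time:.2f} years")  # -> doubling time: 1.51 years
```

          Real data won't fit a straight line this perfectly, of course; the point is only that "exponential trajectory" is a checkable, falsifiable claim rather than a vibe.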

          You having the same thoughts as me 40 years ago isn't really convincing, either. If I had to guess, I'd say you probably held these thoughts not because they were logical but because they sounded cool. This is probably also why you feel that "life has [proven you wrong]", even though it hasn't: some of the things you naively expected to happen didn't happen, so you abandoned the entire idea of exponential progress and/or a bright future for humanity altogether. It's the same reason so many atheists become believers, really. But even if that wasn't the case for you, personal anecdotes don't constitute a strong argument either way.

          I still don't really see what societal issues you are talking about. Could you be more concrete? Since you didn't give a specific example, let me disprove at least the statement that "technological solutions do not work for societal issues": solitude (not to be confused with "loneliness") was a societal issue that has been all but solved by the internet and smartphones. And, to address your last concern in that paragraph: I do appreciate that people are bound to have differing opinions, thoughts, and interests, but in a world where anything you desire you receive on demand - and this includes psychological preferences - the word "issue" itself becomes a practical impossibility (again, save for a few areas like exploration of real-world space, philosophical problems, and the like).

          Interesting. I don't think this is a mainstream opinion, though. Diseases have existed all throughout history and were actually much deadlier in the past due to inferior medicine. Climate change may reduce the population significantly, I do agree - although this may be balanced out by natural population growth. But I don't think it's likely that the global population will drop to an eighth of its current size in the foreseeable future - once again, assuming global catastrophes like a nuclear war don't take place.

  • Clunk42

    That's not a utopia at all. That ends up as a Wall-E situation. Everything you say is wrong. The world has only gotten worse and will only get worse as humanity "improves" itself.

    • Neesa

      That's what I thought, yet people insist things are way better.

      • SingleUse

        Because, if we speak objectively, they are. Practically the only ways in which the present is inferior to any point in the past are subjective. But, objectively speaking, now is by far the best time humanity has ever seen.

    • SingleUse

      How is that not utopia? If you want, you can just carry on living like you were before, no one is stopping you. In this simulated world, anything that you want to do you can do.

      Also, how is the world getting worse? I am really curious. Do you really think dying from common diseases, hunger, and execution is better than the life that we are living in the modern world?

      • Mammal-lover

        That is the world we live in, actually. For all the ease of life and happiness we have, there's somebody being miserable and suffering so we can have it.

  • RoseIsabella

    Yes, but what will felines be doing in 200 years?

    I honestly couldn't care less about humanity in 200 years if more big cats go extinct!

    • bigbudchonga

      It depends. I know there's a bill being put forward to end ownership of them in the US, which would essentially shut down a breeding programme containing more than twice as many tigers as are left in the wild.

      The future doesn't look good for big cats, tbh. Ironically, the people who are trying to help them are decimating the species and the conservation effort.

      • RoseIsabella

        I certainly wouldn't ever support a bill to outlaw the ownership of big cats. I think if a person takes care of a big cat properly, then he or she should be able to own one.

        • bigbudchonga

          Me too. I think it needs to be strictly regulated, though, to make sure they're living in good conditions, but I'm really worried about what's going to happen to them. It seems like the conservation effort is going to take a massive blow. Idc if you saw Tiger King, but there's this woman called Carol on it who tries to "save" big cats, but all she does is castrate them so they can't be bred, and she's trying to outlaw ownership in the US! I know it comes from a place where she's trying to help, but if her law passes, she'll have done more damage to the conservation effort than if poachers killed every single tiger left in the wild!

          • RoseIsabella

            Carol sounds like an awful cunt, no wonder that Tiger King guy wanted her dead!

            I've personally always wanted to get up close and personal with a cougar. They used to keep a live cougar on the campus of the University of Houston. I think that Carol bitch sounds a lot like those PETA fuckers.

  • JellyBeanBandit

    Yeah, I've always thought people in general think too small and too short-term about the future. When people think about the future, they basically just picture current people and lifestyles with fancier gadgets. But the world will be turned upside-down by the possibilities technology will bring: all work will be done by robots, disease and ageing will be cured, we could genetically engineer ourselves to be athletic and intelligent, we could create anything we want with molecular assemblers (including food), and we could experience anything we want in VR.

    Also, it is easy to discount the positive aspects of today's world, because by rights they should always have existed. Racism/sexism, starvation, poverty, disease, lack of education, etc. are unacceptable now, so we don't treat their decline as a great achievement - we just see it as normal. So we hardly notice the improvements that have been made. All we see are the injustices that are still around, and we think they'll never be solved.

    That being said, there is still a danger of us turning into a dystopia. Right now, corporations practically run the world, and in the future they will do so even more; all of this might not come to pass if the corporations don't want it to. They'll no doubt see that if money becomes worthless, they'll lose power, and so they might try to stop any technology that could lead to this utopian society.

    Plus there are some things that are much worse about today's world than in the past, and they will probably get worse, like the corporations monopolising everything we need, the corporations ruling the government, people becoming more stupid and controllable, government surveillance becoming easier, etc.

    So yeah, while I do think we will end up in the utopia rather than in the dystopia, there's certainly no guarantee.

    • SingleUse

      Exactly - survivorship bias at its most evident. Personally, I would be cautious with genetic engineering, as there is always the question of where to draw the line, but I agree that the possibilities will be endless. I would also mention that "VR" won't just be a fancy gaming experience - it will literally be a world indistinguishable from reality in every respect imaginable, with all of our senses replicated perfectly, because either our consciousness itself will be digitally recreated or the appropriate areas of our brain will be stimulated. Once such technology exists, there will be no justifiable reason to stay in the real world instead of the virtual one, as every real-world benefit can be recreated there (I guess space-related prospects would be the only exception).

      Your second paragraph is what I've been trying desperately to explain below, but people just don't seem to agree or understand: bizarrely, they say that they'd rather starve for their entire life, occasionally get tortured by officials, live their entire life as a slave, and eventually die from a preventable disease than "be controlled by corporations". It seems like these people do notice the improvements that we have made, but because they don't want to give up their misanthropy, they decide to believe that they aren't actually improvements at all.

      Superficially, it might seem like there is a chance: if the governments don't do anything to keep up with the accelerating progress, what will end up happening is that the 0.0001% who run major companies will rule the world and live a practical utopia, while everyone else will be in poverty and struggling for survival. However, it just sounds highly implausible that, among all of the chaos that will result, all world governments will simply ignore the vast majority of human population - especially given that most first-world countries are democracies. More plausible is that UBI will eventually be introduced and be continually increased, until money becomes useless, and we'll be fine.

      I would also caution against viewing corporations as "evil". The people in charge of big companies aren't some megalomaniacs who value the dollar as more important than a human life; the reality is that a lot of them are regular people, just like us, and they won't cling on to their power at the expense of humanity like you fear they will. And, even if they do, there is always the government and UBI, so I don't see any real danger here. I reckon issues caused by, and associated with, global warming and pollution will remain our biggest hurdles to overcome in the next century, but even if we struggle to mitigate them, at worst our inevitable progression towards utopia will be set back a few decades.

      To be honest, I don't agree with some of the problems you have listed. Corporations have always monopolised, and, if anything, there now exist more measures to tackle this than in the past. Corporations having so much power is the only way we can ensure economic growth, so it will always be a trade-off between progress and dependence on corporations; I opt for the former. People are actually becoming more intelligent with each generation (see the Flynn effect), and we are becoming more open-minded as well (likely a result of the internet making information more accessible). Government surveillance is indeed becoming easier, but, come on, I'd take that over all the things you listed in your second paragraph any day of the week.

      Imo, if we don't get it horribly wrong somehow - be it with weaponising AI, extreme negligence on the part of governments, or allowing an all-out nuclear war - there isn't much in the way of our advancement to utopia. Of course, we always need to be cautious, but I really think that we are, with more people being sceptical or concerned with global issues than ever.

      • JellyBeanBandit

        Yeah, I agree that we're much better off nowadays than we've ever been, and I have total faith in the ability of future technology to create a utopia. I just worry (even if it is unlikely) that governments could become so totally controlled, corrupted, and infected by corporate money that they'll be able to change whatever laws they want and get away with any rights violations they please, and there'll be no one powerful enough to stop them, not even the UN. I do think nearly all of the largest corporations are evil, tbh, along with the people who run them. It's difficult for a large corporation to compete without saving some money by violating human rights or polluting the Earth; certainly most of the big corporations I can think of have done stuff like that. Also, it has been shown that there's a much higher rate of psychopathy among successful business CEOs than among the general population.

        Combine all this with the fact that people are so complacent about it all. Realistically, they practically don't care what the corporations do as long as they're still offered the latest shiny products, as is evident from the fact that some of the most evil corporations are also the most popular (McDonalds, Nestle, Walmart, Apple, etc.). People also practically want to be spied on, evident from the fact that they willingly use things like Alexa, Google, and Facebook despite fully knowing that the corporations deliberately use those to spy on them.

        It's interesting, though, what you said about large corporations being needed for economic growth; I've often wondered whether they are absolutely necessary to fund the development of advanced technology. It might be impossible for a technological civilisation to develop without capitalism and large corporations, so they could be a necessary evil and a risk we have to take.

        Ok yeah, I guess I was being biased in saying that people are getting dumber. I have heard that they're actually getting smarter, alright - I just find it hard to believe. But I guess any examples I've seen of them acting stupid are just anecdotal. I do worry that they're still not smart enough, though; they seem to be very easily persuaded and distracted by marketing and PR, and tricked by politicians rallying them with bullshit patriotic speeches.

        • SingleUse

          I think it would require some REAL, HUGE mess-up on the part of many nations and political parties at the same time for that sort of thing to happen. Sure, it's not impossible - say people in most developed countries keep electing Nazis and megalomaniacs to power, who then subtly kill the democracies in their respective countries and do nothing about the growing gap between the rich and the poor, convincing the public that this is how it was supposed to be in the first place. I can see such a scenario taking place, but only in the same vein as I see a false vacuum bubble engulfing Earth, or a gamma-ray burst aimed at the Solar system cooking our planet to death. All of these scenarios are certainly possible, but none of them is likely to happen.

          The view that big corporations are evil seems to be quite popular nowadays, but I don't think it's justified. First of all, what is the worst thing that big corporations have done? Your answer would probably be something like "they neglected our privacy". Is that really enough to brand a group of people who have done so much good for society as "evil"? Like Ivan the Terrible- or Stalin-level "evil"? I mean, okay, perhaps you respect the privacy of your data more than others do, but if that's the worst their morals allow them to do, then there is little risk of the human race being put in jeopardy. In my opinion, politicians and religious leaders are much more dangerous - take people like Putin, Rouhani, and Jinping, for example. Jinping is not ashamed to execute people based on their religion, and neither is Rouhani, who also wants to return his country to the Middle Ages, while Putin can leave an entire country in a state of stagnation just for his own personal wealth and power. Imo, if we can avoid electing politicians such as these (although Jinping at least cares for the people of his country, so maybe exclude him from this list), we'll be just fine. The aims of corporations and consumers mostly align, anyway.

          Interestingly, I am one of those "complacent" people. I turn all the privacy settings to minimum so that I can have the most personalised experience and so that the corporations have more data to work with and improve their final product. I don't really care if some information about my location or web activity or microphone activity gets represented as a data point somewhere in Google's code if that means that I get a better experience. Honestly speaking, I don't really understand why people would worry about this, either: it's one thing if your personal information gets disclosed to the public, but in this case no one even looks at it, so why would you worry? Unless you have engaged in illegal activities or are a very famous person, no one will ever get to look at your private data. Perhaps you can explain this bit to me, because I am genuinely curious.

          To address your last point, acting stupid and being stupid are two completely different things. The smartest people have held the silliest beliefs, and their justifications for holding these beliefs were anything but rational (to give you an example, Bobby Fischer believed Jews to be an inferior race, and Newton believed in alchemy). So don't judge people's intelligence by their opinions. Even so, though, the current generation is much less susceptible to PR, scams, and propaganda than older generations. Belief in pseudoscience (including ideologies such as New Age, creationism, homoeopathy) is by far the lowest among the younger generations, and older generations are also much more likely to fall for a scam/fake news story than are younger generations. Of course, these areas are still problematic, but they are definitely less of an issue now than they were previously.

          • JellyBeanBandit

            Invading our privacy is nowhere near the worst thing corporations have done. Just off the top of my head:
            (1) McDonalds and other corporations have lobbied to lower, and even abolish, the already abysmal minimum wage, even though their CEOs get paid millions.
            (2) Food companies have deforested the Brazilian rainforests to produce beef and palm oil, even though it means the death of all the wildlife, orangutans, and indigenous tribespeople living there.
            (3) Apple, Nike, and lots of other companies have used Chinese child slave labour to make their products.
            (4) Google and others have taken advantage of the tax system, setting up branches in other countries (tax havens) to avoid paying their fair share.
            There are countless cases of all of the above, done by hundreds of companies; I honestly don't know too much about it all because I get too angry to look into it.

            Well, the principle alone should be more than enough for you to not want to let the corporations spy on you. They initially tried to do it without your knowledge or consent - they actively tried to secretly spy on us and steal our personal information, and that's just vile. Nowadays it's more widely known about, but that's despite their attempts to suppress it. They then sell our personal data to companies to profit from spying on us. We then have to suffer targeted advertising psychologically manipulating us into spending our money. These companies can end up building such a sophisticated psychological profile of us that, combined with future AI, it could predict our interests and spending habits well enough to subtly drain money from us throughout our lives. That's not even to mention the illegal things they could do with it behind closed doors, like giving the government access to it all in exchange for more lenient tax breaks, or allowing health insurance companies to view our health-related queries.

  • Mammal-lover

    I mean, we are running out of farmable land and living space. The oceans only have so much fish, and overfishing will only get worse. If your idea of utopia is paying 5 bucks for a can of tuna, then by all means hope away.

    • SingleUse

      There are so many problems with this comment.

      First of all, did you not read the part where I said: "it's only a matter of time before all of our jobs become automated - extrapolating from historical data, this is likely to happen in around ~100 or so years"? This means there will be no money, and everything will be either free or distributed equally to everyone.

      Secondly, overfishing won't be much more of a problem in the future since world population will stabilise at around 10 billion, which is only a 25% increase from the current population.

      Thirdly, is overfishing REALLY your biggest concern with respect to the fate of humanity? Your measure of advancement of humanity is access to fish? I am not sure if this is a joke or something, but this sounds totally ridiculous. If anything, fishing is controversial in its own right, as fish are also able to feel pain, just like us humans. It would probably be a good thing if we all stopped fishing.

      Lastly, and most importantly, my entire post is based on the idea that, in less than 200 years, reality simulation will be possible. This means that, whatever product it is that you desire, you can get it in infinite amounts. This includes fish, but also yachts, houses, planets, etc. If you didn't bother to read my post past its title, why did you bother replying?

      • Mammal-lover

        Sounds like you desire a fantasy world that won't happen. We can't produce products out of thin air, nor is overfishing my biggest concern, nor will the population stabilise at 10 billion. You are just a straight-up idiot if you think people will suddenly stop wanting kids. Nor will everything ever be fully automated: as someone who has worked a factory job and a manufacturing job, you simply can't input the data a robot would need for that. There's too much to do. Too many variables. You're a dreamer, and I can respect that, but this is reality, where there's limited ground, limited fish, limited nutrients in the soil. We are actually running low on sand, so those alive in about 80 years, give or take, are gonna have problems with that. Not to mention drinking water, and feed for animals, unless you think everyone is gonna be vegetarian. Then there's still crops, but here's the thing: unless we start finding a way to purify human waste, we will still be reliant on animals for fertiliser to add nutrients to the soil. We are constantly losing farmland to housing, and forest to farmland. Guess what, we need trees. Trees provide us with oxygen through a process called photosynthesis.

        Plus, with the increase in your carbon footprint and all the pesticides we use, we are killing all the bees. Bees pollinate, and without pollination we don't have crops and growth. So in 200 years, at the rate of growth you mention, with full automation, pollution will be at an all-time high, killing off the insects and others that we actually rely on, which will invariably kill us all off in no time at all - and most likely cause a huge world war over resources, which will eventually devolve into a civil war among the winners over the last of the resources, until eventually we are all dead, or we smarten up and basically put ourselves back in the stone age, with the chance of our scientists figuring out a solution being very unlikely.

        So basically, your dream is the end of all humanity, once you add in the facts of human needs, animal needs, the planet's needs, and, well, the facts of life. Utopias don't exist for a reason: because they can't. Besides, you can't have machines repairing machines. The number of times machines have broken down at work over something so stupid and little... it's not even funny.

        • SingleUse

          I am not really "dreaming" about anything here - I am just being objective and thinking critically. Thinking critically doesn't always mean thinking pessimistically; sometimes it means thinking ambitiously, as in this case.

          I never said that we can produce products out of thin air in real life, but, in a digitally simulated world, almost no computing power is required to simulate a single fish. And yes, in such a world, resources, just like our wants, are infinite, and yes, we can produce them out of thin air. That's what I mean by "utopia": anything that you want, you get on demand, and there is nothing that you could possibly want that you won't get on demand. As far as I'm aware, that's pretty much what the word "utopia" means - a world where all of everyone's needs are met.

          World population will actually stabilise at 10 billion or thereabouts due to the limitation of resources on Earth. Sure, there are many population projections out there, but all of them agree that population growth will either stop completely or slow down enormously this century, and that it won't ever reach 15 billion. You can read more about this in Wikipedia's entry on "Projections of population growth".

          As for factory job automation, I seriously hope you don't still work there, because otherwise I've got bad news for you: most factory and manufacturing jobs are already automated, with some factories being fully self-functioning (the only human workers in such factories are maintenance workers). But it's not only manufacturing jobs that will be automated in the near future - ALL of the professions that currently exist will be automated within the next 100 years, including those of scientists, doctors, entrepreneurs, etc; it's only a matter of time. There is plenty of evidence for this: if we look at computer processing power over the past 150 years, we can extrapolate with a very high degree of confidence that, in 15 years, the most powerful computer will have more processing power than the entire human race combined. But yes, I know, processing power isn't everything. How about the fact that the same programme (AlphaZero), which knows nothing about chess, go, and several other games beyond their rules, can teach itself to become far better than any professional human at every one of them? And it's not just brute-force calculation, either - its entire expertise actually comes from intuition. With the current super-exponential rate of advancement, what possible reason can there be for programmes such as AlphaZero not to become more general and eventually able to perform every human job? For automation to suddenly stop, all progress needs to come to a standstill, which can only happen in the event of a global catastrophe. If that doesn't happen, there is no reason whatsoever to expect automation to stop, and if it instead continues at its current rate, all human jobs will be automated within the next 100 years.
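As an aside, the "more processing power than the entire human race" extrapolation is easy to sketch. Every number below is a rough assumption of mine (a brain at ~1e16 operations per second, ~8 billion people, a top machine at ~1e18 ops/s, performance doubling every 1.2 years), and the answer shifts by decades depending on which estimates you plug in:

```python
import math

def years_to_crossover(current_ops, target_ops, doubling_years):
    """Years until exponentially growing compute matches a fixed target."""
    return math.log2(target_ops / current_ops) * doubling_years

# Hypothetical figures: ~1e16 ops/s per brain times 8e9 brains,
# versus a top machine at ~1e18 ops/s doubling every 1.2 years.
humanity_ops = 1e16 * 8e9
print(years_to_crossover(1e18, humanity_ops, 1.2))  # ~31.5 years with these guesses
```

With these particular guesses the crossover lands around three decades out rather than 15 years, which mostly shows how sensitive the claim is to the input estimates.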

          Pollution? I agree it's not ideal, and it is certainly one of our very top priorities to reduce the effects of global warming, but it's just not going to make us extinct. Like with fish, I am not convinced that the lack of particular crops (bees are only responsible for about 1/3 of crop pollination) or honey will kill humanity. On the other hand, I know this sounds a bit "selfish" (probably not as selfish as torturing animals and then eating their flesh, though), but once we are in a simulation, we won't care about pollution or insects. Also, with money having gone extinct, politics will probably die as well. There simply won't be a need for war when the same resources are available to everybody, and there is nothing that anybody can do about this (as all jobs will by then be performed by robots, who will always be more efficient at their jobs than humans). Honestly, I don't think your end-of-the-world scenarios are realistic. Even in the worst scenarios that you describe (which aren't realistic to start with, as most bee species aren't even endangered or threatened), there is no way that every single person on Earth will die.

          My dream is certainly not "the end of all humanity", and we are not even discussing what my dream is - we are discussing the future of humanity. Sure, there are loads of problems for us to solve (as there always have been), but provided we solve them, it's very difficult to imagine what will stop us from achieving the state that I described in the post. If we can agree that we won't go extinct in the next 200 years, then whatever prevents us from reaching that state will require some explanation, which I haven't seen as of yet.

          "Utopias don't exist for a reason" - except they do. We live in our ancestors' utopia already. You see, everyone's definition of "utopia" is different, so, as long as the world isn't objectively perfect according to everyone, it's easy to argue that the contemporary world isn't a utopian world. But that doesn't mean that our contemporary world wouldn't have been considered utopian in some earlier era - I imagine many people in the Middle Ages would think that it can't get much better than having access to free education; having an entire lifetime just to do what they love, with all the tedious tasks done for them; being able to travel physically to any part of the world in less than a day; and having the collective knowledge of the entire human race always available on-demand. But, okay, even if the modern world is someone's utopia, it's only a subjective utopia. The reason an objective utopia doesn't exist yet is that we aren't quite there yet, but we are getting there at a rapid pace, and are already very close.

          Finally, on machine self-repair: modern machines can't fix themselves simply because they aren't general enough to do so. However, when AI exceeds both the generality and the intelligence of humans, there will be absolutely nothing that a human can do that a machine won't be able to do more reliably and efficiently. Yes, that includes repairing and maintaining robots.

          • Mammal-lover

            What you are imagining is like digistruct in Borderlands; that's not how it works in real life. You can't make something out of nothing. Plus, you forget people thought we would have flying cars by this point.

            You simply can't do construction with robots, nor manufacturing. And yeah, I still work there. I make just shy of $20 an hour in a state whose minimum wage is 7.25, which is what many places pay. Not to work for more than double that would be idiotic when you can. You really have clearly never worked manufacturing, then.

            Machines can't quality check; they just can't. Nor can they communicate effectively in such a location, communication being essential. There are so many variables in manufacturing that you can easily have the easiest day of your life or the hardest with a snap of a finger - something anyone who's worked such a job knows. Unless we somehow get the robots from that Will Smith movie, with an intelligence and capability like that, it's not gonna happen.

            Besides, creating a bipedal robot is far too difficult, and anything with more than two legs is gonna be too damn big for manufacturing.

            Just look at my job. Something goes across the belt: it might fly off, get stuck, or go through perfectly. The next item of the exact same cut, at a slightly different weight, plays those same dice. I've seen hundreds of them go through perfectly and then one gets stuck for no reason. The cost alone to program and create something that could replace humans would be more than a human could ever cost. It's not happening - not in 100 years, not in 1000; it's far too difficult and costly. Not to mention unethical. Even doctors can't be robots. This isn't some universe where you can just inject yourself with something and watch your wounds close. You need a doc. Wounds vary in far too many regards, both internally and externally.

            Machines are good and very helpful but they are nothing more than a tool. They will never replace humans.

            • SingleUse

              I don't think you understand what I'm saying. Imagine a video game. Do you think it's possible to write a bit of code that loops a particular fish in and out of existence ad infinitum? Well, reality simulation is basically a video game, except it's so advanced that it's impossible to tell it apart from the real world. Do you think it's possible to programme such a virtual world to have whatever resources the people inside it desire?
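To make the point concrete, here is a toy Python sketch (the names and the "fish" are purely illustrative): "producing" a resource in a simulated world is just instantiating an object, which costs effectively nothing.

```python
import itertools

class Fish:
    """A 'resource' in a simulated world - trivially cheap to create."""
    def __init__(self, species="cod"):
        self.species = species

def fish_supply():
    """Loop a fish in and out of existence ad infinitum: each tick
    spawns a fresh Fish, and the previous one is garbage-collected."""
    for _tick in itertools.count():
        yield Fish()

supply = fish_supply()
catch = [next(supply) for _ in range(5)]  # take five ticks of an endless supply
```

Whether a whole world of such objects can be simulated at fidelity indistinguishable from reality is, of course, the actual point of contention; the sketch only shows that a single resource loop is cheap.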

              I think you should say this to the factories which are fully, 100% autonomous (https://www.siliconrepublic.com/machines/automated-factories-video). Construction is a bit more complicated than manufacturing, although construction robot prototypes already exist and will replace construction workers in the near future (https://www.robotics.org/blog-article.cfm/Construction-Robots-Will-Change-the-Industry-Forever/93). You are right in saying that I have never worked in manufacturing, but I do have access to the internet, which tells me that some factories are fully autonomous - which is true regardless of whether either of us has ever worked in manufacturing.

              Indeed, machines can perform quality checks, sometimes better than humans (https://www.sheltonvision.co.uk/news/automated-quality-control-vs-manual-inspection/). Some products are too complex for machines to master currently, but, as is the case with every other human job in existence, this will change in the not-so-distant future. Also, even if you believe that mastering your job requires human-level intelligence, human-level intelligence will itself be achieved in ~100 years' time. Sooner or later, even the most sophisticated jobs will get automated.

              Mobile bipedal robots already exist. See Atlas from Boston Dynamics, which I'm sure you'll have heard of or seen, as one example.

              You are thinking linearly. To you, the next 1000 years will see 1000 years' worth of modern-day progress. Very counterintuitively, they will actually see more like 10,000,000 years' worth. What this also means is that some things which you see as totally outrageous and unrealistic, and which are currently far too expensive or outright impossible to manufacture - like robots which can replace your job - are actually likely to become reality in the near future. When I say "in the next 100 years", you should really read that as "in the next 20,000 years", as that would be a more accurate representation of the amount of progress we will make in that time. And I think you would agree that it isn't that far-fetched to say that AI may become advanced enough, and cheap enough, to be a viable replacement for humans in your field.

              Believe it or not, robots have been better than humans at diagnosing diseases for over 30 years. Now, diagnosis is only a small part of the field of medicine, but, as AI becomes more precise and general, it will be able to cure the diseases it diagnoses as well. In fact, doctors (apart from perhaps psychiatrists and other types of doctors whose field of expertise is based around social interaction, or in which social interaction plays an important part) are likely to be fully replaced by robots in the next 30 or so years.

      • Clunk42

        So you think that technological improvements will somehow allow communism to be successful? I highly doubt that.

        • SingleUse

          And is there any particular reason why you doubt that? When all jobs become automated, capitalism won't work by definition - "a system where industry is controlled by PRIVATE owners" - because there will be no human workers left for private owners to employ. Also, I am sorry to upset your capitalistic feelings, but UBI will have to be introduced pretty damn soon (15 to 20 years maximum), or else inequality will rise to the point where the majority of the population is in poverty. Capitalism is an efficient system for driving economic growth and progress, but only as long as only humans are involved. When robots join the game, the situation becomes very different.

    • litelander8

      Right? A burger is cheaper than a bottled water.

      • Mammal-lover

        It's kinda funny how that's true.

  • litelander8

    no

  • --

    💤

  • KholatKhult

    Not if I have anything to say about it

    • BleedingPain

      What would be the most stereotypical way for you to do it? Riding a bear, guns a-blazing? (Or is that too 'murrican?)

      • KholatKhult

        The unemployment rate among the bear population is atrocious these days.
        And we follow an old Russian proverb here;
        “Tank solve problem good”

  • Tommythecaty

    Now is not one of the worst periods in human history, not even close.

    200 years from now, the world will be a police state and everybody will have Asperger's.

    • SingleUse

      Agreed with the first sentence; on the contrary, now is by far the best period in human history, and it's not even close.

      Your second sentence was probably a joke, so I won't dwell on it. But just remember that people with high-functioning Asperger's are actually smarter, on average, than the general population.

      • Tommythecaty

        Not at all; many are only good in one area and terrible at everything else - it's a bit of a myth. There are just as many high-IQ people who're non-autistic, and as such, without the disadvantages.

        They will on the other hand fit into the police state better with all the structure and rules.

        • SingleUse

          Re-read my comment. All I said is that people with Asperger's who don't have an accompanying intellectual disorder are slightly smarter than the general populace. That's it.

          Asperger's people are also more likely to be nonconformist due to their deviant nature, meaning they would actually fit worse into a police state.

          None of this is even to mention the fact that Asperger's syndrome will never spread to the general population, because it poses a very obvious evolutionary disadvantage (people with Asperger's are less likely to reproduce), or the fact that there is absolutely nothing to suggest that the world will be a police state in the future - the trends actually tell the opposite story, with the world becoming ever more libertarian.

          • Tommythecaty

            Yeah, and non-Aspergic people with high IQs are smarter than the general populace... so it's not much of a point.

            Aspergic people are the least devious people by far (including the ones I know). Their behaviour doesn't deviate from the norm anywhere near the way cluster-B types' does. They seem to think anyone acting out of line is somewhat dangerous and intimidating, because of their poor ability to read other people.

            The Aspergic people I know are rigid, straightforward, afraid of breaking laws and rules, and dislike it when I or others break them. They are almost painfully boring.

  • NoLifer

    I think corona threw a wrench in your plans, buddy.

    • SingleUse

      Coronavirus is not even the worst pandemic of the past century - not even close. It's also only the 4th most devastating thing to happen to us in the last 110 years - after the two world wars and the Spanish flu. That's not to mention the entire history of humanity, where it probably wouldn't make the top 100. We are barely going to notice the effects of this pandemic in the long run - the amount by which it will slow our progression to utopia falls well, well within the margin of error of my prediction (I reckon it could slow it down by a year or so).

  • Ellenna

    Yeah, right, a utopia where people are so well educated they know the difference between invariable and inevitable.

    • SingleUse

      I thought "invariably" meant "without variation", i.e. "with no other possible outcome"? Either way, I didn't claim to be educated, and nor is English my first language. I don't really see the need for this comment.

  • bigbudchonga

    I can't think of a first-world country that's able to sustain even a replacement-level birth rate. Most first-world countries are currently undergoing dysgenics, and women's happiness has been falling for the last 50 years.

    The West and most first-world countries are dying, dude, and the big power that's likely to take their place, China, is super authoritarian and also can't sustain a replacement-level birth rate. If trends continue, we're fucked. The only places that are actually doing objectively better at sustaining themselves, as well as improving quality of life, are third-world countries, where it really isn't that difficult to progress, since their current station is very low.

    • SingleUse

      "Sustaining itself" isn't everything. Basically all first-world countries will still be around in 100 years, and they will still have enough population to remain economic powerhouses right up until full automation. Also, more people only means faster progress, so third-world countries growing in population will only speed up our journey to utopia. China being authoritarian isn't a problem either, as politics will become extinct when all human jobs are automated. Too bad for Chinese citizens (although I'm sure they would prefer living in an authoritarian first-world country to a libertarian third-world one), but everyone else won't be affected.

      • bigbudchonga

        Sustainability is the foundation of any system. If current trends continue, then not all first-world countries will have the population to remain economic powerhouses; if trends continue, the average IQ will fall to a level that is incapable of producing great technological advances.

        Why would politics become extinct? People still need to survive; they will still need protection and certain functions of a ruling body. War has always been around and always will be; all you would need is for one nation to go to war and the system falls apart. Also, I can't see complete, full automation ever really happening; there are far too many variables. We can't even produce a piece of fully automated technology capable of keeping the streets clean or picking up our household rubbish for us.

        Is there even a single piece of fully automated technology around in the world that's capable of sustaining itself?

        Even if it were possible to produce fully automated technology for every conceivable occurrence that may arise in man's life and satisfaction, you'd be dealing with a literal universe of variables which could upset it, many of which won't even be known at the time of creating it.

        • SingleUse

          I am not sure which trends you are referring to, but if we go by economic growth, leading economies will remain leading economies in the future - the margin between them and other countries will just become smaller, which is a good thing, because it means the world is becoming more developed as a whole. You see, human civilisation isn't a zero-sum game: other countries becoming more advanced doesn't mean already developed nations somehow lose their level of development. And their native populations may decrease by 5-10% in the next few decades to a century, but that isn't nearly enough to hinder their economies.

          Also, current trends actually show a steady increase in global IQ scores, and less developed countries growing in population won't change that: as underdeveloped countries grow in size, they also become more developed in the process, granting them better nutrition, which is a significant contributing factor to intelligence. While genetic factors may also play a role in intelligence, factors such as nutrition play a much bigger role (as the Flynn effect demonstrates comprehensively).

          Politics won't be needed when money is extinct. When full automation is reached, the same resources will be available to everybody, so there won't be any need for war - every human on Earth will already have the most that he can possibly have. And, if we take that one step further into simulated reality, it would be impossible for someone to want something and not have it - all their wants can be satisfied instantaneously through code (which will probably be done automatically through mind-reading). So war won't really achieve anything either way.

          I think you are being too close-minded here. Robot janitors are already a thing (https://www.cleanlink.com/news/article/Amazon-Explores-Replacing-Janitors-With-Robots--20932). The reason we don't see these guys taking care of our roads is that doing so requires a high degree of navigational autonomy, as well as the ability to scale obstacles such as stairs. However, both of these fields are seeing rapid development (e.g. autonomous cars and the Spot Mini robot) and will probably reach full automation within the next 10-15 years, possibly becoming mainstream in 20. Robots picking up household rubbish face the same problems. But let me tell you: when these problems are addressed, it's not just cleaning that will be automated - our entire households will be automated, from cooking, to serving food, to buying food, to laundry, to ironing clothes, etc. In addition, cars and transport will be fully autonomous as well.

          Regarding full automation, you can read my reply to Mammal-lover. Full automation is pretty much inevitable and will in all likelihood happen this century or early next century; all the data points in this direction. I understand that this is tough to imagine and seems very far-fetched, but the reason it does is that our intuition tends to extrapolate linearly, while progress is not even exponential - it's super-exponential. This means that the next 100 years will actually see 20,000 years' worth of modern-day progress. If even this thought fails your intuition, think of this: what would people in the 70s think of you if you told them that, 40 years later, they would be able to access the sum of human knowledge just by asking the question they wanted answered, or order any product on Earth just by asking for it; that they would carry supercomputers many times more powerful than the most powerful contemporary supercomputer in their pockets; that computer games would have open worlds the size of cities, with graphics close to indistinguishable from real life; etc? That's right, they would probably tell you what you are telling me now: none of this will ever happen, not in 40 years, and not in 200. And yet here we are. Think about it - the speed of technological progress is scary, for better or for worse.
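To show where a figure like "20,000 years' worth of progress" can come from, here is a toy Python sketch of Kurzweil-style accelerating returns. The rate-doubling period of ~9.5 years is my own assumption, chosen so the arithmetic reproduces the figure; it is not a measured quantity:

```python
import math

def equivalent_years(horizon_years, rate_doubling_years):
    """Integrate a progress rate r(t) = 2**(t / rate_doubling_years),
    measured in 'present-day years of progress per calendar year',
    over the next `horizon_years` calendar years."""
    T = rate_doubling_years
    return T / math.log(2) * (2 ** (horizon_years / T) - 1)

# If the rate of progress doubles every ~9.5 years, the next 100
# calendar years pack in roughly 20,000 present-day years of progress.
print(round(equivalent_years(100, 9.5)))
```

The sketch only shows that the arithmetic is internally coherent; whether progress actually compounds like this is exactly what is being debated.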

          Yes, there are indeed fully self-sustaining automated systems (https://www.popsci.com/technology/article/2010-07/diarrhea-bot-sporting-artificial-gut-eats-excretes-all-itself/) out there right now. Sure, they are currently very primitive, because self-sustainment is very hard, but this will be standard when AI becomes more advanced and more general.

          I have actually been thinking about the dangers of living in a digital world. For example, disruptions in code could result in massive suffering. This problem in particular can be resolved by mechanically shutting down the digital consciousness as soon as some level of pain is reached, or by not allowing the sensation of pain past a certain level at all. I know this sounds simplistic, but our civilisation will be so advanced by then that a working solution will certainly be found. What other problems can arise? I know that some of them are unknown unknowns, but can you give me an example of what sort of thing you are talking about?

          • bigbudchonga

            You can't sustain that economic growth if your populace shrinks and becomes stupid. IQ is falling in most developed nations; the Flynn effect has been dead for ages in the West. Flynn himself has literally published research showing falling IQ levels in first-world countries. And basically any nation that becomes first-world can't sustain even a replacement-level birth rate. What good is it if third-world countries get better, if they throttle once they hit a certain stage?

            Self-sufficient full automation is likely completely impossible, dude. The data does not point in the direction that full automation is possible, or even that a single thing has accomplished it; the data points in the direction that there are gains in technology which have massively increased in the modern world. The Amazon article you shared in absolutely no way shows that those robot janitors are a thing - it straight up says in the article that "patents and patent applications don't guarantee the idea will become reality". And the robotic gut in the other article you shared is not self-sufficient and self-sustaining: it can't create more of itself, and it can't repair itself. Hell, it can't even find its own food source; it has to be provided with one. It doesn't even have programming remotely concerned with the basic materials needed to fix itself.

            If you don't force everyone to live in the virtual world full-time, then you'll have people seeking to destroy it, just to name a single factor. All they need to do is hack into a single section of the AI and it's all over. How are you going to have a self-sufficient piece of technology when everything goes to shit because they've shut off a single piece of AI that's required to keep the whole system functioning, and your human populace has been catacombed in some AI world without the knowledge to deal with these things for the last 50 years? Or a single piece of the code goes wrong, things stop working, leading to more things that stop working, and a populace without even the knowledge to create a piece of wire is pushed out into the real world?

            How are you going to deal with climate issues you never knew about at the time? How are you going to fight off countries that don't want to live in a virtual world and are actually creating arms and weapons to take out the machines, or terrorists? All they need to do is find a single weak spot in the programming and defense system to exploit. How are you going to keep humans around as a species to enjoy this future if they're so wrapped up in this brilliant virtual world that they don't care about the real one? And if you artificially breed them, then how are you going to programme into the algorithm the evolution that will happen to those humans, when evolution is literally caused by random mutation?

            There's not a single piece of technology that has been made that's self-sufficient and fully-automated, and we're not even close to creating one.

            • SingleUse

              Crap, I didn't know about that, but it seems like you're right. One hypothesis is that high levels of immigration from less developed countries may have been a factor, but it seems like the reverse Flynn effect is real. Increased levels of pollution, and decreased time in the fresh air due to increased use of computers and gaming, are other possible factors. Still, while this isn't great, it certainly won't impact the rate of technological innovation that harshly, given that 1) the decline is nowhere near as sharp as the increase seen during the Flynn-effect era, and 2) most innovation is produced by a very small group of individuals at the very end of the curve, where most of the variation is the result of randomness, and any changes caused by recent and likely future declines in IQ fall well within the scope of noise.

              Third-world countries getting better is better than them not developing at all, even if they eventually throttle. First-world nations will only stop being first-world if they are economically and technologically surpassed by other nations. Sure, we may lose loads of potential innovators from these countries, but we will gain loads more from the newly developed nations. So, in my opinion, nothing to worry about.

              If it's full-on completely impossible, what is the cut-off point? At which point will technological progress, which has been on an exponential trajectory since the dawn of mankind, suddenly halt? Is there some fundamental law of physics that says "full automation is impossible"? I'd like to hear your response on this one. People like to label everything as "impossible" these days, and such people usually get proven wrong within a couple of decades: aeroplanes, space rockets, commercial personal computers, etc. I don't and probably won't ever understand the rationale behind calling things which show clear tendencies and which are clearly physically possible "impossible". I understand "not feasible in the near future" or, at worst, even "something humanity won't ever reach" for some apocalyptic reason, but "impossible"? I don't think I will ever get that.

              Okay, perhaps the article I sent you wasn't up to date, but the point is that many companies already rely on technologies such as the one in the link (e.g. https://www.nanalyze.com/2018/12/robot-janitors-commercial-floor-cleaning/). Yes, I am aware that these aren't perfect janitors, in that they are incapable of, say, picking up larger objects, but, as I have already explained, this is all coming in the near future.

              The gut robot is actually fully self-sustaining: if it were released into a sewer, it would be able to survive all by itself. As for self-repair, that is far too advanced a process to be feasible today. There is a reason that mechanical engineering is currently one of the most in-demand professions, and mechanical engineers are probably going to be some of the last people to lose their jobs. But if we are talking about self-sustainability and self-sufficiency, we already have that.

              You see, when you are dealing with an AI more intelligent than humans and specialised exclusively in one field - security - it becomes next to impossible for a human to hack anything guarded by it. Something like that may be a problem in the early days of consciousness simulation, but as soon as AI becomes better than humans at security, there is no chance. Things not working is a more serious problem, but I think that, as long as a large enough number of AI programmes, written using different methods, is there to watch over any potential errors in the code, we should most likely be fine. If one AI doesn't see the error, or two, or a thousand, one out of a million probably will (recall that, by this point, AI will be many orders of magnitude more skilled than humans at writing code, as well as at detecting imperfections).

              As I said, politics will become extinct. "Country" will become synonymous with "culture". There won't be presidents or armies - there simply won't be a need for them (I already explained this bit). Also, if people don't trust this virtual world, no one is forcing them to go inside. Terrorists are a bigger issue; honestly, I don't see a perfect solution at the moment, but things like that tend to solve themselves naturally. I reckon the same kinds of questions were being asked when the first nuclear bomb was being developed, and yet here we are, 80 years later, still alive and well. I don't know what kinds of technologies will exist in 200 years' time, which planets humanity will have populated, or what the general state of affairs will be, but seemingly serious concerns existed for every type of revolutionary technology, and basically all of them were solved when the technology was implemented. Still, it's a valid concern.

              Finally, humans won't need to care about the "real world" - the virtual world will be their "real world". And if you are referring to the biological species Homo sapiens, I don't see why it's important to preserve it. If you want to think of it this way, humans will have evolved into a different, non-biological species, and I don't see anything wrong with that. The chances are, human curiosity will take over, and many will still want to explore the depths of "real-world" space. Whether they do it in their natural, biological form or some other form isn't, in my opinion, an important question.
