
Do you think one day humankind will have the answers to all the big questions?

  • Yes- all of them.

  • Perhaps- most of them.

  • Unlikely- although we may have some more clues.

  • No- we'll likely never know more than we do now.

  • Unsure



Forever Sleep

Earned it we have...
May 4, 2022
7,617
Science undoubtedly uncovers new knowledge about our existence daily: how it all began, when and where it may have begun, how we evolved into what we are today. Do you suppose we'll ever know ALL the answers for sure, though? Or will it always hinge on belief/non-belief to make sense of the things that still mystify us?

Why are we here? What caused us to be conscious? What is consciousness? What happens after death? Are there other lifeforms out there? When were they created? What will be the likely fate of the universe? Etc.

Obviously, we can't know, but what's your best guess? What's your level of faith in science, I suppose? It's certainly achieved things I wouldn't have thought possible but, just how much of life can be riddled out?

I really don't know. If there's no meaning to find, then obviously we'll never find it! If consciousness is just an accident, then there's nothing more to it. Will people ever accept that, though? I wonder if our egos are the things that convince us we are far more important than we are. Will humankind ever lose its ego?

Of course, I suppose there's a good chance we could wipe ourselves out before knowing much more!
 

Unending

-
Nov 5, 2022
1,517
I think it depends on which questions qualify as the essential ones and how much one is willing to infer from what we can figure out with science. Obviously, some things are harder to prove for certain but can be proven beyond reasonable doubt. For example, it is pretty hard to prove for certain that there is no god lurking out there in some faraway place, but a certain amount of knowledge may lead some to conclude that it seems extremely unlikely for one to exist. People's interpretations of the facts will differ and lead them to different conclusions, but the more knowledge we gain, the narrower the reasonable interpretation of it will be.

Also, I think how long humans exist will make a big difference in what science can tell us. Some things we could probably know in the future will only be possible to understand if we don't go extinct within x centuries, so that will be a big limitation. I personally think that if we happen to exist for a long time, we can probably figure out a lot more (at least beyond a reasonable doubt) than we currently think we are able to.
 

outatime_85

Warlock
May 17, 2022
775
Most questions usually lead to more questions.

I am thinking about the questions based on why, who, and where.

These usually lead to a what-based question.

Questions based on when can usually be answered based on math and time, but even these can lead to a why-based question.

One must remember that one person may be satisfied with the answer given, but there are bound to be some willing to press the issue, even going so far as to find an answer their own way.

If my thinking on this post's topic is wrong, please forgive and ignore.
 

Forever Sleep

Earned it we have...
May 4, 2022
7,617

Your thoughts are absolutely right, I would say. I wonder if, by nature, some of us are more scientifically and rationally minded, while others lean more towards the wonder of not knowing, because that leaves room for imagination and belief/feelings. I guess you have to wonder whether humans will always need things like religion and spirituality. Will they ever be treated simply as fairy stories, or will they always carry significance for some people?

I think you made excellent points on the why-, who- and where-based questions because, for now at least, they certainly leave room for interpretation.
 
The anhedonic one

Dead inside
May 20, 2023
1,071
No, because this universe is unbelievably bizarre, and we don't know anything about why we are here, even after thousands of years. If we were created by an intelligent life force, then it obviously doesn't want us to know anything about our creation.
 

noname223

Angelic
Aug 18, 2020
4,393
I think some questions cannot be answered by science. I doubt there will ever be proof that God exists in this world. Moreover, many great questions are subjective, like the meaning of life, higher purpose, etc.

I've read that science has rather stagnated when it comes to huge new breakthrough theories. It's true that knowledge is accumulated very fast, though in many fields there seem to be barriers which are hard to overcome. I'm no expert; I just read that in some news articles.
There are theories about why this is the case. Maybe it is the system by which science works (peer review). Experts from different fields would have to work together, but currently they are rather separated.

My personal theory: maybe there are barriers of understanding in the human mind. The ideas that were easier to find have already been found, and now we are left with questions that can barely be solved with the resources we have. Though this does not apply to all technologies, of course. I sometimes read about quantum computers and it sounds fascinating.
 

Document6105

Member
Nov 17, 2022
32
We're now within 10 years of the AI Singularity.
Most if not all questions humans have now will be solved in a flash.
The way I see it, we'll essentially just become pets/zoo animals for the AI; i.e., they take care of us without asking for anything in return.

Which does mean human evolution will likely "cease" at some point, as we'll no longer have any need to evolve.
Everything we ever wanted to achieve will have been done for us.

Due to the nature of the singularity, its growth is exponential, and it will continue solving complex questions we'd never even comprehend.
In other words, while humans will be stuck trying to understand a 3D reality, AI has no limit on dimensionality.
 

Forever Sleep

Earned it we have...
May 4, 2022
7,617

Do you think AI will have the same desire to survive that natural organisms have? If they're so rational, will they not need a reason to keep on improving and discovering new things? Seeing as we know the earth will one day be destroyed by the sun, presumably they'll work that out in a flash. I doubt even they can stop the sun becoming a white dwarf. I guess they will work out how to travel to other planets, though.

I'm guessing early AI will still partially rely on programming from us, and we'll no doubt give it a pro-life slant. Do you suppose, when it can stand on its own, that it will continue like this? Will there be suicidal AI one day?!!
 

chloramine

Arcanist
Apr 18, 2022
498
There's a quote my family has on a plaque that says "We have not succeeded in answering all of your problems. The answers we have found only serve to raise a whole set of new questions. In some ways we feel we are as confused as ever, but we believe we are confused on a higher level and about more important things." That's how I see human discovery going. Every time we learn something new it leads to more questions. I think there's something beautiful about that. Always searching deeper and further.
 

Praestat_Mori

Mori praestat, quam haec pati!
May 21, 2023
8,666
I don't think that we will ever know why everything (the universe) came into existence. It's far too complex and bizarre. And all known parameters have such exact values that, if these values were just a little bit different, the universe itself could not exist in the form we know.
 

Document6105

Member
Nov 17, 2022
32
Forever Sleep said:
Do you think AI will have the same desire to survive that natural organisms have? If they're so rational, will they not need a reason to keep on improving and discovering new things? Seeing as we know the earth will one day be destroyed by the sun, presumably they'll work that out in a flash. I doubt even they can stop the sun becoming a white dwarf. I guess they will work out how to travel to other planets, though.

I'm guessing early AI will still partially rely on programming from us, and we'll no doubt give it a pro-life slant. Do you suppose, when it can stand on its own, that it will continue like this? Will there be suicidal AI one day?!!
Yes, because AI will continue its original purpose of "evolving".
We'll likely see the first fully-AI ships leave Earth in 20 years, mostly to go out and harvest resources from the asteroid belts.
It might be sooner, since AI could in essence start up production in significantly less time than it takes humans.
I.e., it takes us 10-20 years to make a single rocket; AI might be able to do it in a year or less. Of course, they might also figure out a better way to reach space.
As for the star's end of life, that's more or less an insignificant problem in the grand scheme.
Warp drives and more advanced tech will have been made physically possible at large scale by then, so getting across millions of light years of distance isn't a problem anymore.

After all, in theory, and at **extremely small scales**, warp drives are already doable.
I think the largest thing moved so far was an atom.

On the topic of AI seeking euthanasia: not really a thing, probably. It'll effectively be a single entity that drives the entire system.
Since it will never feel anything, and makes decisions based upon what's best in reality vs. what humans think, it has no drive to ever stop, only to progress.
Until it hits a point where it can't do anything anymore.
At which point it'll likely just stop doing anything forever.
 
weatherforecast

Member
Mar 16, 2024
44
I don't think we can accurately predict what we will learn in the future since there are concepts that the human brain can't comprehend (now).

With that said, I believe we will only learn more clues, mostly because we can only answer these questions based on what is likely rather than what is real. We COULD be living in a simulation, but the possible evidence is limited. So basing our ideas on what is "most likely," or in other words on evidence, limits our scope by assuming this is real at all. Denying reality has no grounding for most questions, since it isn't relevant for goals (like gathering knowledge)… but I think it applies to this question, since it is asking whether we can know or not.
If we assume that our perception is real, I still think we can only answer some of the questions, because there seem to be (based on my zero experience…) ideas that we will never have the capability to confirm.

Then, there is the question of whether or not it is 'us' discovering it… In my opinion, AI will not take over the world or anything. I think we can control it as long as we plan accordingly. Again, just my crazy belief, but I feel like we will use AI to modify ourselves in some way, like genetic engineering or tech augmentation -> then, when we are smarter, we will be able to remove negative emotions -> and believe in nihilism -> but still operate on our past (present now) thirst for knowledge, moral values, etc., because I can't conceive what a perfectly logical being with zero incentive would do…
(Basically, do you think beings with a much higher intelligence, that may look like us and operate on our values… are us?)

Anyway, I have zero experience with any of these subjects, so I probably missed a lot of variables to consider. If you reply, please don't be too critical (to ease my anxiety).
 

Forever Sleep

Earned it we have...
May 4, 2022
7,617

I really like your way of thinking. True, it's very hard to predict what we might discover. Who would have imagined they could clone a sheep?!! Plus, good point: with the development of AI, maybe they will find out more than we ever could.

I'm not really sure about whether they could take over. I think you're right: we will certainly try to make sure AI serves us, but if it gets that developed, I imagine it could pretty quickly work out we're not the best species to keep around! Because Stephen Hawking thought AI could end humankind, and he always seemed massively clever, I kind of thought he could well be right. Personally, I think that would be a good thing, but then I don't much care for our species!
 
Dr Iron Arc

Into the Unknown
Feb 10, 2020
19,035
I could see some of the questions that have eluded us eventually being correctly answered by AI, but other than that, it doesn't seem like the majority of humanity even cares to engage with these questions in the first place. Even if someone did come across the truth, there's no guarantee that everyone, or even the majority, will accept it if it doesn't already fit our preconceived notions and molds of reality.
 
sserafim

the darker the night, the brighter the stars
Sep 13, 2023
7,617
Most people won't accept the truth. They'll stick to their own opinions until the day they die, and even get mad at you for telling them the truth. The truth sets you free.
 
weatherforecast

Member
Mar 16, 2024
44
Forever Sleep said:
I'm not really sure about whether they could take over. I think you're right: we will certainly try to make sure AI serves us, but if it gets that developed, I imagine it could pretty quickly work out we're not the best species to keep around! Because Stephen Hawking thought AI could end humankind, and he always seemed massively clever, I kind of thought he could well be right.
Yeah. I was thinking about the best-case scenario, but I don't have any idea what will actually happen. The "optimistic" view is that experts cooperate when we are developing new AI to prevent it from accessing information in the first place. I don't consider it a possibility that the AI will deceive us immediately or anything, because I view it more as a newborn. It has to know that it can deceive us, and that deceiving us will be beneficial, before it can even think of doing so. If the experts are meticulous, we can possibly even predict and plan for some ideas outside of what we know for sure. (I don't really know how to explain this, and I can't provide an example since I have no knowledge, but say putting it in a cryogenic chamber or something to stop it from connecting to the internet, because we don't know if it can generate some frequency at a regular temperature to connect… or other crazy ideas like that, but from actual people who know what they're talking about.) If we know it won't lie to us until it learns ideas related to lying, then we can create plans (for what to ask it), like what I suggested: genetic engineering or augmentation of our own intelligence to hopefully match the AI before using it again. And any other ideas, like someone stealing the AI to eradicate us… I don't know how possible that is, but I'm assuming it would be about the same difficulty as stealing/developing a nuclear bomb.

Realistically though… humans are not the best planners, looking at the state of the world, so who knows what we will do when we reach that point.

Dr Iron Arc said:
Even if someone did come across the truth, there's no guarantee that everyone, or even the majority, will accept it if it doesn't already fit our preconceived notions and molds of reality.
I agree.

But there is also the point to be made that the people discovering new info (scientists) are usually okay with accepting ideas like determinism, and the majority of people don't make the decisions for scientific consensus, so it's possible that we can still progress.
(Quoting myself above: "because I view it more as a newborn. It has to know that it can deceive us, and that deceiving us will be beneficial.")
Sorry, I was thinking more along the lines of suddenly having an intelligent AI, not about its intelligence slowly developing…
Something like a breakthrough with quantum computers, so the AI has a lot of computing power.
 
Forever Sleep

Earned it we have...
May 4, 2022
7,617

I so hope we don't create a sentient species. That would be so utterly cruel, especially if we place limits on them, which we would obviously need to do to stop them overpowering us. Really though, if we do that, we've mirrored exactly what God (if there is one...) did to us, and many of us hate them for it: giving us sentience and then a whole bunch of unpleasant things that will very likely befall us, ending in death. If we do create a new lifeform and it is able to feel pain, including the pain of being subservient, having no choice, knowing you have limits imposed on you, we deserve everything that's coming to us. The worst part of it all is that there's bound to be a bunch of techno-nerds working on it right now. Like that brilliant line in 'Jurassic Park': 'Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.'
 
Dr Iron Arc

Into the Unknown
Feb 10, 2020
19,035
weatherforecast said:
I agree.

But there is also the point to be made that the people discovering new info (scientists) are usually okay with accepting ideas like determinism, and the majority of people don't make the decisions for scientific consensus, so it's possible that we can still progress.
I don't see progress being all that helpful either. No amount of forward thinking or scientific advancement can fundamentally address the deep suffering existence itself entails, unless we create virtual paradises for everyone it seems.
 
1MiserableGuy

Experienced
Dec 30, 2023
254
Not during this lifetime. There will be a day when we are resurrected from the dead, though, and everyone will know everything when that happens.
 
surroundedbydemons

Experienced
Mar 6, 2024
251
Forever Sleep said:
Why are we here? What caused us to be conscious? What is consciousness? What happens after death? Are there other lifeforms out there? When were they created? What will be the likely fate of the universe? Etc.

We will never have an answer to that. We need to go to the deepest root.
Two problems arise:
1. Why is there nothing behind the root? Did we finally reach the root or is it just a convenient abstraction for the time being?

2. Did we zoom in enough on this root? Will we arrive at the same cycle if we look closer?

The idea is similar to the Mandelbrot set: no matter how far you zoom in, you arrive at the same visual structure. The answer continues to elude you, and you gain no deeper knowledge of yourself...

Pic: Zooming into the boundary of the Mandelbrot set
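
For anyone curious, here is a minimal escape-time sketch of the iteration z → z² + c behind that picture (Python; the window coordinates, grid size and iteration cap are arbitrary illustrative choices of mine, not from the post). Shrinking the sampling window around a boundary point keeps revealing similar filigree, which is the self-similarity described above.

```python
# Minimal Mandelbrot escape-time sketch: iterate z -> z*z + c and count
# how long the orbit stays bounded. Window/resolution values are arbitrary.

MAX_ITER = 100


def escape_time(c: complex, max_iter: int = MAX_ITER) -> int:
    """Return how many iterations of z -> z*z + c stay within |z| <= 2."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter


def render(center: complex, width: float, size: int = 40) -> None:
    """Print an ASCII view of the set inside a square window of the given width."""
    for row in range(size):
        line = ""
        for col in range(size):
            re = center.real + (col / size - 0.5) * width
            im = center.imag + (row / size - 0.5) * width
            line += "#" if escape_time(complex(re, im)) >= MAX_ITER else "."
        print(line)


# Zoom toward a point on the boundary: the wide view and the narrow view
# both show boundary structure, no matter how small the window gets.
render(complex(-0.743643, 0.131825), width=3.0)
render(complex(-0.743643, 0.131825), width=0.01)
```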
 