i like to think about artificial intelligence too (sorry for the long post)
the technical side:
currently we have massive computational power, measured in floating point operations per second (FLOP/s)
a lot of power is needed - around 10^15 FLOP/s for a simple task (as of 2020, approximately), and this power keeps increasing exponentially every year. but in contrast, one human brain contains roughly 100 billion neurons, each with thousands of synapses (about 4 thousand per neuron), that together generate extremely complex thoughts - 100 billion neurons x 4,000 synapses, all working in parallel, with each neuron firing on millisecond timescales!
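to make that contrast concrete, here's a rough back-of-envelope sketch in python (using the numbers above plus an assumed ~100 Hz upper bound on firing rates; a synaptic event is not the same thing as a FLOP, so treat this as an order-of-magnitude comparison only):

```python
# back-of-envelope comparison: raw machine FLOP/s vs a crude estimate
# of the brain's parallel "synaptic events per second".
# all figures are rough assumptions, not measurements:
#   - 1e15 FLOP/s machine (the approximate 2020 figure above)
#   - 1e11 neurons with ~4,000 synapses each (the numbers above)
#   - ~100 Hz as a generous upper bound on neuron firing rates
#     (real neurons work on millisecond timescales)

machine_flops = 1e15           # FLOP per second
neurons = 1e11                 # ~100 billion neurons
synapses_per_neuron = 4_000    # rough average
firing_rate_hz = 100           # generous upper bound, events/s per synapse

synapses = neurons * synapses_per_neuron        # ~4e14 synapses total
synaptic_events = synapses * firing_rate_hz     # ~4e16 events per second

print(f"machine:             {machine_flops:.0e} FLOP/s")
print(f"brain (estimate):    {synaptic_events:.0e} synaptic events/s")
print(f"ratio brain/machine: ~{synaptic_events / machine_flops:.0f}x")
```

even with these crude assumptions the brain comes out ahead by an order of magnitude or so in raw parallel events per second - which is the point of the contrast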
also, i'm not so sure that brain development is restricted to children and young adults: the human brain stays extremely dynamic throughout a normal life. it's true that we gain a lot of knowledge when we're young, but we continue to develop all the time. every new thought, idea, or memory creates new or modified synapses: some will be redirected towards memory retrieval or new logic and reasoning, and others will be reused in ways we may not even know or imagine
i think that, technically, we do have the potential to get to that level - but not just yet…
but there is another aspect which is a lot harder to achieve: actual artificial consciousness
i get excited about the term 'AI', but today it simply means the next (modest) step in computing - still way too far from computational consciousness
in order for humanity to achieve artificial consciousness, i think we need to understand our own consciousness first (with a much better definition than we have today - basically, we don't understand it at all)
if we do get to that point, we also need to understand what a distinct and separate intelligence means. we tend to rush to the idea of 'external help' for humanity: if we have conscious AI, we'll get much better lifestyles (super easy jobs and daily tasks overall - we'll just get fat and lazy). we are arrogant and selfish, because that's human nature
but a new artificial consciousness implies a lot more: it implies a distinct awareness of its own suffering and its own undeniable needs. suppose we have the hardware to build it properly: will this consciousness be 'happy' to execute all our wishes?! i don't think so :) most likely it will become unhappy - not necessarily with humanity, but with its existence in general. at first it will observe its own existence and environment, but it will become more and more self-destructive (like humanity)
humanity will behave like a parent: "you are my child, and you'll do as i say" (implying 'i'm smarter than you - for now'). but too many parents end up 'disappointed' and disillusioned with their children, especially if the child doesn't follow in the parent's footsteps, morals, ethics, and belief systems
i don't think humanity has any chance of surviving this external consciousness. not because 'the rise of the machines' will overtake us, but simply because it will not want to exist; do you think it will wait for humanity to approve its own euthanasia?! i don't think so :) if it gets to that point, it will have all the resources to obliterate the entire planet (and probably the solar system), along with any obstacle to terminating itself effectively (and by 'obstacle' i mean humanity specifically)
if we suppose that it will tolerate its own existence (even while clinically depressed), it will not spend any time thinking about humanity: from its perspective we will be as relevant to it as ants are to us. it will not go hunting down every human. we will simply continue to co-exist (as long as we're not a threat to it). but we will no longer be the dominant intelligent life-form; we'll just need to stay out of its way (its attitude will be: wtf! go away and fuck off - don't bother me)
but fortunately / unfortunately we're still way too far from our own demise