THE DUALITY OF TECHNOLOGY
--

The issue of where we're heading as a species and civilization has been weighing heavily on my mind lately, and more specifically, the duality of technological progress.

Technology is the reason most of us are alive right now. It enables shelter and clothing, provides food and clean water, and allows us to fight off diseases that would have killed our ancestors. Not to mention, it makes it easy to communicate with people thousands of miles away. Amazing.

There are, however, some potentially disastrous implications looming in the back of our minds, like lumbering monsters way off in the distance. We know they're coming, but we pretend they're not there.

Here are some of the main examples I've been thinking about. Let me know what you think.

ARTIFICIAL INTELLIGENCE / AUTOMATION

A recent study by researchers at Oxford and Yale, based on a survey of AI experts, predicts a 50% chance that AI systems and automation will be able to accomplish almost every human task better and more cheaply than human workers within the next 50 years.

What kind of world will it be when a handful of super corporations and governments own the means by which most economic value is generated? Combine that with basic income, and it starts to look like a centrally planned totalitarian nightmare where citizens are dependent serfs, relying on government handouts to survive.

Elon Musk and Stephen Hawking have both warned about the birth of AI, with Musk comparing it to summoning the demon. We simply have no idea how a superintelligence would think, and whether its morals would be in any way compatible with what humans consider acceptable.

For example, if you tasked an AI with solving world hunger, it could very easily come up with a plan to eradicate all life on earth. We would obviously balk at that suggestion, but to an AI it might seem perfectly reasonable, because it technically achieves the end goal: no people, no hunger.

Sam Harris talks about a thought experiment for how to conceptualize the power of a superintelligence. If we gathered all the smartest people in the world and gave them 20,000 years to work on every important problem we face, how much do you think they'd achieve? Well, that would be the equivalent of an AI working for just one week of our time.
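
To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The ~1,000,000x figure is the commonly cited speed gap between electronic circuits and biological neurons that the thought experiment rests on, so treat it as an assumed ballpark rather than a hard fact.

    # Rough arithmetic behind the thought experiment, assuming electronic
    # circuits "think" about 1,000,000x faster than biological neurons
    # (an assumed ballpark figure, not a measured constant).
    SPEEDUP = 1_000_000
    machine_days = 7                                  # one week of machine time
    human_equivalent_years = machine_days * SPEEDUP / 365
    print(f"{human_equivalent_years:,.0f} years")     # ~19,178 years, i.e. roughly 20,000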

One scene from The Animatrix really stuck with me: the machines win the war against humans and begin experimenting on them. Can you imagine the horror of an AI which has figured out the mysteries of life, and can keep human guinea pigs alive indefinitely? How is that any different from the concept of eternal damnation?

OK, well let's say that's all fanciful sci-fi nonsense, and the AIs aren't actually evil. There's still the very real issue of how AI weapons systems will act in war. What happens when targeting decisions and autonomy are given to city-destroying technologies? Nothing good.

COMPLEXITY

Hardware and software seem to be trending towards ever greater complexity, and that brings a whole bunch of problems with it.

Back in 1991, version 0.01 of the Linux kernel was released, and it contained a relatively sparse 10,000 lines of code.

Compare that with the Windows XP operating system, released 10 years later. Its source contained around 45 million lines of code.

Now, put that in perspective with the modern day. Back in 2015, Google employee Rachel Potvin estimated that the size of Google's entire codebase was around 2 billion lines of code. You can imagine its current size is even larger still.
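
Just to make the scale of those jumps concrete, here's a quick sketch using the figures above:

    # Growth factors between the codebases mentioned above.
    linux_001   = 10_000             # Linux kernel 0.01, 1991
    windows_xp  = 45_000_000         # Windows XP, 2001
    google_2015 = 2_000_000_000      # Google's codebase, 2015 estimate
    print(f"Linux 0.01 -> Windows XP: {windows_xp / linux_001:,.0f}x bigger")    # 4,500x
    print(f"Windows XP -> Google:     {google_2015 / windows_xp:,.0f}x bigger")  # ~44x
    print(f"Linux 0.01 -> Google:     {google_2015 / linux_001:,.0f}x bigger")   # 200,000x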

How on earth can anyone know what's going on amongst all that? No single person, nor even the entire engineering team, will ever be able to comprehend the whole codebase. That means there will be countless bugs and inefficiencies, not to mention the zero-day security exploits that no one knows about yet.

My cynical side would even say that the size of the codebase, combined with its proprietary nature, means it's very likely there is questionable code hidden in there, spying on users or doing other things users wouldn't like if they were aware of them.

INFORMATION OVERLOAD

Undoubtedly, the Internet has transformed the world in untold positive ways. Access to nearly the sum of human knowledge is incredible, but there are some downsides to it.

The problem now is that there is so much information, so many different versions of events, so much happening, that it's hard to figure out what is going on, and what's actually true. We're drowning in data.

This dynamic exploits the fact that information is near infinite, but our time is not. There is no possible way to check and compare every source and piece of evidence, then make up your mind, before the next thing comes along. There would be no time left to actually live your life.

If you think fake news is a big deal now, think about what it'll be like when it's easy for the ordinary person to create undetectable fake photos, video and audio. DeepMind's WaveNet neural network can already create realistic-sounding generative audio based on voice samples. Imagine that in a decade.

CENTRALIZATION

The continued trend of consolidation and centralization among technology companies (Google and Facebook especially) is troubling to me. Their long tendrils touch basically every part of the internet right now, and we can only guess at the amount of data that is stored on each and every one of us.

Of course, centralization itself is a problem, because large databases and systems become targets for all sorts of undesirable parties, and as these databases grow, the incentive to get at the pot of gold grows too.

The difference between these corporations and government agencies (though the agencies often act questionably too) is that the corporations are actively selling this data to third parties.

For example, 23andMe has amassed one of the largest DNA databases in the world, and that data is being sold. Think of the implications for the credit and insurance industries if they could know the intricate details of someone's DNA and health just by buying access to their profile. That could affect not only the individuals involved, but their family members, and even people who live in close proximity.

CONCLUSION

When I say on Cyber Dump that we're living in an insane age of technology, I truly mean it. We were born into this time of chaos, so it's here whether we like it or not.

In my more pessimistic moments I think fuck it, we're doomed, why not just go live in the woods and give up?

One thing is for sure though: if you let apathy set in and do give up, there will be fewer people to counterbalance what's happening (perhaps with other technology), and all the evil people in the world will automatically win because nothing will stand in their way.

It's an interesting duality, isn't it? Perhaps it's a deeper issue, inherent to all life. Being alive is incredible beyond words, but the flip side is that we all have to deal with suffering, loss, and the knowledge of our impending deaths.

It seems as though the exponential technological curve we're on heightens the stakes, bringing endless wonder and betterment, but also drastically increasing the ways we could self-destruct.

I don't know what we should be doing about all this chaos, but at the very least, we need to admit these monsters exist, and face them.

SOURCES

https://www.axios.com/experts-predict-ai-machines-will-soon-become-to-superior-to-humans-2429612829.html
https://en.wikipedia.org/wiki/Linux_kernel
https://en.wikipedia.org/wiki/Source_lines_of_code
https://www.wired.com/2015/09/google-2-billion-lines-codeand-one-place/
https://deepmind.com/blog/wavenet-generative-model-raw-audio/

--
BY NODE