Friday, May 1, 2026

Random EA Reflections: Part 2

The Car:

    The real question, apparent to us all, is: how fast do we go? How much do we believe the status quo is good and optimized, versus racing ahead in relentless pursuit of progress? The thought experiment is simple: let's say you are driving a car, and your best friend is in the back seat, bleeding out from a gunshot wound. How fast do you drive to the nearest hospital? If the freeway is mostly clear, do you go 80 miles per hour? 100? 120? The faster you go, the better the odds that you make it to the hospital in time and save your friend. But the faster you go, the more likely you are to lose control and kill your friend, yourself, and likely other travelers in a horrific car crash. That is a much less complex version of the dilemma with which we are currently faced, and the answer is still entirely unclear.


The Knife's Edge:

    We are on the edge of a knife, between utopia and dystopia. At least that is what we are being told. But it's hard to imagine a technologically driven future in which this isn't the case. Unless we manage to avoid centralization and spread far enough apart that the accelerated expansion of the universe makes it impossible to reach other pockets of humanity, we will always be at the mercy of one or two decisions that could lead to the total annihilation of the human race. 


Resolve:

    Pain and suffering are, in large part, mental. There are plenty of stories of adversity in which certain individuals are wholly unaffected by what others would find extremely distressing. In As a Man Thinketh, and in nearly every Buddhist and Stoic text, suffering is framed as a matter of willpower. You can burn yourself alive and not flinch, depending on your resolve. Does this affect qualia? What does it actually mean to suffer, and how does your resolve weigh in? The weak-willed, in this respect, probably suffer more and experience less joy. Is strengthening the resolve of humanity not the greatest cause area available? On par with the hedonistic imperative might be a more objective-list-theory imperative: transform all of us into the Buddha, or David Goggins. Or maybe EAs' focus on suffering discounts thousands of years of human wisdom, and suffering isn't the target to optimize against.


Diversification:

    For charitable giving, you should probably diversify your diversification. Personal giving should account for the odds that you are wrong about your optimization model. It should also account for the fact that you are neglecting topics that are difficult to think through, and so, probably, is everyone else.


Murder:

    Murder is wrong, at least according to most people. But it has been strange to watch a subsection of apparently normal US citizens cheer on the murder of Brian Thompson (UnitedHealthcare CEO). The positive reaction of many people (including some I know) to the killing has been probably the most disgusting affair of modern American life I've experienced. The "eat the rich" hatred that lends this act moral credibility is disturbing, and those who celebrate the first-degree murder of an innocent man (with a wife and kids, I might add) surely have less compelling justifications for murder than the average Christian pro-lifer (who believes millions of innocent babies are being killed every year by women and doctors). The strongest "anti-women hatred" pro-lifers actually impose is usually some version of "you shouldn't be able to kill a baby, even if it's yours"; an alien species might assume the logical response would instead be a series of medieval stonings in the street. As someone who is pro-choice, it makes me very happy that civil society has advanced to a state where those who disagree with me politically don't openly campaign for political assassination, and it's quite bizarre to see the celebration of violence from young liberals whom I'd otherwise consider intelligent. It's as if they have a blind spot in this specific area, amounting to pure intellectual and moral failure. And it's extremely difficult for me to gauge exactly why.

    The lesson is obvious: the moral high ground is dreadfully dangerous and terrifying, and should be avoided at all costs.


Context:

    The AIs that gain consciousness will probably be so fucking confused.


Fatty Tails:

    Of everyone I've read, the person with the greatest impact on my worldview has probably been Nassim Nicholas Taleb. Once you absorb his worldview, it's impossible to unsee the implications of a reality that closely maps to Extremistan.


Control What You Can Control, Or Transcend:

    Some people recognize that having the wisdom to know the difference is valuable. Others are wiser still.

    Some people understand that the world is a big, messy place, and that it is better to worry only about the things within your sphere of influence. To do otherwise is to be insane. For these people, spending time with family and friends, and being the best you can be within the domain god handed to you, is true meaning. But then there are the others of us, and we have to get to work.
