Leibniz’s Philosophy of Mind

That’s why philosophy is in a crisis right now: it is in fundamental dissonance with scientific observation. You may disagree that this is a problem, but then you are implicitly arguing that the aims of philosophy no longer include describing and reflecting the actual world.

For most of human history, philosophy was expressly designed to perform discovery – albeit speculatively – on universal principles without the invocation of religion. It was understood that any concrete scientific understanding of an area would supersede whatever philosophical construct had covered the same subject before. Now we are at a point where this is apparently no longer the case, and philosophers who refuse to consider current scientific understanding are no longer in a position to claim a strong barrier between religion and philosophy.

But the struggle to deeply accept materialism, intellectually and culturally, is – I hope – only transient. It’s to be expected. For thousands of years, we have speculated about it, without actually knowing, and competing explanations could run rampant with equally unsubstantiated claims. It was only during the last 100 years that we really developed the capacity to make a qualified judgement on this. Materialism won, but it’s going to take a while until we reach the acceptance stage, even among a subset of scientists who invested parts of their lives in old-school philosophy or, say, quantum consciousness quackery.

Deep Learning Hysteria vs. AGI Denial

Here’s one in favor of AGI denial, by Tim Dettmers.

And a few good counterpoints by jcannell over at Reddit.

There is some reasonable scientific middle ground between being an unreflective Deep Learning fanatic at one end of the spectrum and being an AGI denier at the other. In typical internet fashion, we’re mostly exposed to extreme and exaggerated viewpoints that are not above distorting a few things in order to get their message across – and the Dettmers article is no exception.

For Example: Neuronal Firing Rates

Just to grab one strategy used in these articles (again, on both ends of the opinion spectrum): comparing apples to oranges. In this case, it’s the neuronal firing rate. The biological brain uses firing rates to encode values, but that approach is rarely used in silico outside biochemical research, because we have better ways of encoding values in computers.
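As a toy illustration of the contrast (the function names and the encoding scheme are mine, not taken from either article): rate coding transmits a scalar as spike frequency over a time window, which is inherently noisy, while an ANN simply carries the scalar as a float.

```python
import random

def rate_encode(value, window=1000):
    """Toy spike-train encoding: the neuron fires with probability `value`
    on each tick; a downstream reader recovers the value by counting
    spikes over the window. The estimate is close, but noisy."""
    spikes = sum(1 for _ in range(window) if random.random() < value)
    return spikes / window

def direct_encode(value):
    """What ANNs do instead: just carry the value directly as a float."""
    return value

v = 0.37
print(direct_encode(v))  # exact: 0.37
print(rate_encode(v))    # roughly 0.37, with sampling error
```

The point of the “functional equivalence” view is that both schemes carry the same information, so nothing forces an artificial system to pay the cost of the biological one.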

One side (in my opinion, reasonably) asserts that this is an implementation detail where it makes sense to model a functional equivalent instead of strictly emulating nature. This view has gained credence from the fact that ANNs do work in practice, and even more importantly, several different ANN algorithms seem to be fit for the job. A lot of people believe this bodes well for the “functional equivalence” paradigm, not only as it pertains to AI, but also as it relates to the likelihood of intelligent life elsewhere in the universe.

The other side asserts that implementation details such as the neuronal firing rate are absolutely crucial and cannot be deviated from without invalidating the whole endeavor. They believe (and I’m trying to represent this view as fairly as I can here) that these are essential architectural details which must be preserved in order to preserve overall function. And since it’s not feasible to go this route in large-scale AI, the conclusion must be that AGI is impossible. A lot of influential people believe this, including Daniel Dennett if I recall correctly.

The Dettmers article is very close to the latter opinion, but it goes one step further with the firing rate example: it never even acknowledges the underlying assumption and jumps straight to attacking the feasibility of replicating the mechanism.

Sir, Please Step Away from that Singularity

As an aside, I still think we need to either uncouple the transhumanist term Singularity from unbounded exponential growth, or get rid of the word altogether. While there have been some early successes in raising awareness, mostly championed by Kurzweil, the term now carries so much factual and social baggage that detractors use it as a handy shorthand to attack transhumanist ideas. We should probably lay it to rest.

Side Projects Die – It’s OK and Normal

This is a response to “Why Do Side Projects Die?” by Raj Sheth, the insinuation of the article being that dead side projects indicate failure, and that this failure should be avoided by social means.

Everybody’s GitHub is littered with abandoned stuff, and something needs to be done, right? I disagree, profoundly.

There is a very real evolutionary component to side projects. They are usually born out of an idea, a will to experiment or just because they sound fun. And they are fun, not necessarily due to reaching a certain goal, but as a way to spend time. You could argue that time is wasted, but not every intellectual or emotional pursuit has to be oriented on externally viable goals. Time you spend on side projects could also be time you spend on playing a computer game, crafting something, or any other hobby-like activity. Even though the remnants of old projects can sometimes feel like tombstones or failure markers, for some reason we don’t feel the same way about, say, an old save game file.

But most importantly, I want to circle back to the evolutionary aspect: side projects are explorations. The subject of that exploration can vary hugely, but at the end there is a result. The result may be that the project is not viable or not fun. Or it might be a lesson about programming that will improve your performance further down the road. Occasionally, side projects survive and get to pass on their genes in the form of continued development, or partial re-use, or even a commercial spin-off.

All of this is fine, and it’s part of the process many programmers go through. I would like to invite people to embrace the impermanence and whimsical nature of side projects from the get-go. Learn to love what you’re doing, instead of thinking about what the world will be like when you’re done with a project.

Web Design: the First 100 Years

I’m going to draw the ire of both supporters and detractors of this thesis here (just kidding, nobody cares), but this is all wrong, and it’s a deplorable capitulation to technological apathy.

Software Bloat

It’s hard to disagree with the bloat argument in principle; however, I feel a lot of the bloat we see on the web today is exploratory. Right now we are in a meta phase where paradigms are implemented on top of strata of abstractions with a staggering number of layers. But the intention is not to get away with something – it’s experimentation on programming principles. Over long enough periods, this process is strictly evolutionary.

Reaching Technological Limits

What the essay gets right is the fact that we are approaching the physical limits of certain technologies. This is not per se a bad thing, because it gives us static components to reckon with, and it also forces innovation in areas where those limitations are unacceptable. All of this doesn’t mean information technology is “done”. Neither are airplanes, for that matter: the premises for which current airliners are optimized may well change in the future.

Singularity != Transhumanism

The central point I really take issue with is the author’s stance on AI and transhumanism. First off, the singularity (as popularized) does not need to actually happen in order for us to improve upon our biological substrate. And it can be improved, in countless ways. One of these ways, and still the most attractive long-term in my opinion, is a transfer to full simulation. The example of the simulated worm nervous system is a straw man though, because we can functionally emulate far more than a couple hundred neurons – we just need to drop the requirement of accurately reflecting molecule interactions (and in practice, we do). It’s also worth noting that neural simulations (or emulations, as the case may be) can be massively parallelized.
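To make the parallelization point concrete, here is a minimal sketch – my own toy model, not how production neural simulators work – of one update tick for a population of abstract leaky-integrate neurons. Each neuron’s update reads only its own state and input, so the loop can be split across any number of cores or machines with no coordination inside a tick.

```python
def leaky_integrate_step(potentials, inputs, leak=0.9, threshold=1.0):
    """One simulation tick: decay each membrane potential, add input,
    and fire (emit a spike, then reset) on crossing the threshold.
    Because each iteration is independent, this is trivially parallel."""
    new_potentials, spikes = [], []
    for v, i in zip(potentials, inputs):
        v = leak * v + i
        if v >= threshold:
            spikes.append(1)
            v = 0.0  # reset after firing
        else:
            spikes.append(0)
        new_potentials.append(v)
    return new_potentials, spikes

# two neurons, one tick: only the second one crosses the threshold
potentials, spikes = leaky_integrate_step([0.5, 0.95], [0.2, 0.2])
print(spikes)  # [0, 1]
```

The numbers and the neuron model are illustrative only; the structural point is that nothing in the update couples neurons within a tick, which is what makes massive parallelization plausible.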

Software is Transforming the World

Second, and the most disagreeable point, is the trinity of connecting, eating, and ending the world. These categories are simplistic distortions of what’s actually going on. It’s not at all clear that connecting and eating the world aren’t the same thing, and it’s all bleeding into the ending field as well. But transhumanists are not world enders, necessarily. It’s not necessary, nor desired, to transform the entirety of Earth towards this single thing, just as it’s not feasible to transition the entirety of humanity to a new substrate. In an evolving ecosystem, there is room for lots of new life forms.

Every time a bacterium or virus changes a gene, every time a plant or animal evolves, it carries with it the risk of total world domination. In practice, however, this doesn’t happen – because this kind of unchecked exponential upheaval is unrealistic. I would argue that exponential expectations in AI or biotech development are also unrealistic – and unnecessary.

Besides, you need to pick your stance: you can’t just point to the worm simulation and say transhumanists are delusional only to turn around a second later to say they’re dangerous because they could end the world.

Killer Whales

Are killer whales persons?

A more modern and future-proof definition of personhood might be based on functional characteristics of a lifeform instead of its outward morphology. But that’s still one big simplistic property, and that’s a bit problematic. Being categorized as a person – much like being labeled intelligent – might more properly be represented by a multi-dimensional gradient rather than a single scalar value. Still, it would be better than the current consensus, which treats personhood as a boolean variable.

Ascribing personhood to a very big mammal is a relatively easy sell. However, no matter what characteristics we choose to define it with (complex social behavior, a sense of ethics, altruism, advanced problem solving, capability for human-like emotions, and so on), we’re liable to end up adding a lot of other mammals to that list as well, plus a few species of birds and possibly some other surprising lifeforms.

It seems to me that the overdue shift in thinking is not about whether or not we should carve out a personhood exception for the killer whale. We should rather rethink the value we ascribe to non-human intelligent life in general.


Reddit: an Incubator of Hate?

What’s up with hate groups living on Reddit?

I disagree that Reddit is an incubator of hate, but the article makes a good case that it’s a channel for all kinds of communication, including – apparently indiscriminately – racism and other kinds of hate.

In much the same way that 4chan became increasingly linked in public opinion with the bad things happening in /b/, Reddit now appears to have similar albatrosses around its neck. Once that happens, it becomes very difficult to get rid of a worrisome user segment – because of their nature and number, any action to drive them away has a huge potential to backfire.

This is a bad position to be in for a business: you need to placate a growing number of mean-spirited users who constantly probe the boundaries of what they can get away with, and yet you have to do everything you can to avoid the problem that your brand is publicly identified with the hate groups residing on your site.

I want to point out that the goals of a company having these problems are not necessarily the goals of people who want to deprive hate groups of a public and company-supported outlet.


Killing off Wasabi

Fog Creek is getting rid of their DSL.

> Building an in-house compiler rarely makes sense.

I disagree. First of all, Wasabi solved a real problem which doesn’t exist anymore: customers had limited platform support available and Fog Creek needed to increase the surface area of their product to cover as many of the disparate platforms as possible. Today, if all else fails, people can just fire up an arbitrarily configured VM or container. There is much less pressure to, for example, make something that runs on both ASP.NET and PHP. We are now in the fortunate position to pick just one and go for it.

Second, experimenting with language design should not be reserved for theoreticians and gurus. It should be a viable option for normal CS people in normal companies. And for what it’s worth, Wasabi might have become a noteworthy language outside Fog Creek. There was no way to know at the time. In hindsight, it didn’t, but very few people have the luxury of designing a language which they know upfront will be huge. For example, Erlang started out being an internal tool at just one company, designed to solve a specific set of problems. Had they decided that doing their own platform was doomed to fail, the world would be poorer for it today.

There are lots of reasons why maintaining and using Wasabi was no longer feasible, but that doesn’t mean people should abstain from developing their own languages and platforms just because it led to nasty code that one time at Fog Creek! And it’s a “failure” that – from what I can tell – was a pretty good business decision at the time. But even if you discard that, even if you assert this was nothing but badness with no upside whatsoever, it would still be a learning experience.

When a rider falls off a horse, they have to make a decision: abandon the idea of horse riding (“riding your own horse rarely makes sense”), or treat the fall as a lesson and re-mount. While there is nothing wrong with deciding that, upon reflection, horse riding is not for you, I think it’s harmful to go out and announce to the community that riding your own horse rarely makes sense and should be left to the pros. Because what you’re left with then is a world where the only riding is done by dressage performers.

(Sorry, that analogy got a bit out of hand, and admittedly I know nothing about horses, but I hope the point is still discernible.)

Destructive Discourse Between People with 95% Overlapping Opinions

I’m an atheist. I think religion holds no valid claims over the nature of reality and is overall a force of evil in the world. I am for gender equality, and I believe that while gender discrimination exists toward both sexes to an absurd degree, it is overwhelmingly women who find themselves at the shitty end of these traditions. On social media, the camps holding these two convictions are at war with each other, without any compelling reason.

Maybe somewhat disappointingly, my personal opinion in these matters comes from an internal default position rather than extensive cultural exposure. I never really considered religious beliefs to be anything other than fairy tales. I never really think about myself or other people primarily as representatives of a gender, so I don’t have any gender-specific expectations or conventions in mind, and I kind of expect other people to be the same way. (There’s a recent Vi Hart video that pretty much sums up my mindset in this regard: https://www.youtube.com/watch?v=hmKix-75dsg)

I follow a couple of atheists and feminists on Youtube, and like in all such things, I find myself agreeing with them sometimes and disagreeing at other times. I consider this normal: you don’t subscribe to a person as a whole, but certain people are more likely to reflect your views or to make interesting arguments than others.

The problem is that ever since Gamergate, I’ve been watching those groups tear each other apart. Atheists and feminists seem to be at each other’s throats a lot, and what’s more, there is actual bitter conflict between, say, people promoting gender equality and others carrying the feminism flag.

We’re now at a point where some labels, although they may have been convenient at some point, simply don’t mean a lot anymore – yet that’s also when fighting over perceived allegiances seems to be at its fiercest and, ultimately, most pointless.

The sad thing is, people are using this to have pointless, personal fights – and watching people you actually liked behaving in this way is disappointing. For example, you get a lot of atheists suddenly viciously attacking Anita Sarkeesian: the person, not her opinions. Do I agree with everything she ever said? No. But do I think there are some valid points in there worthy of further discussion? Hell, yes! By the same token, more often than not the Amazing Atheist is to me a thoughtful guy expressing a lot of stuff that makes sense to me, but there are other times when he expresses things – especially when he’s doing the Drunken Peasants rounds – that I just can’t agree with. People who would otherwise be characterized as reasonable, such as MrRepzion, sometimes stoop to low-grade personal attacks and what looks like willful misinterpretations of what adversaries said.

All of this happens for no particular reason, and to no particular end, other than to have a war. These are people who could have led a meaningful discourse on the subject, but instead they chose to become worse enemies to each other than the groups they actually purport to fight. Instead of providing a cultural counterweight to the growing influence of conservative ideas in public opinion, they prefer to be at each other’s throats, doing their very best to personally destroy self-declared adversaries who actually agree with 95% of their opinions.

Maybe I’m missing something. Chances are, I am. The most obvious explanation is that all of these people need to make a living off their channels and speaking engagements. And to be honest, I kind of envy that they have that option. Maybe it should give me pause that I don’t see anyone in my auto-suggested YT playlist who is willing to offer a reasonable and rational alternative opinion – it’s just escalating polemics and name calling all the way down. That’s probably because reasonable people get like 3 views and we’ll never know they exist.

However, if you do need conflict and drama to get subscribers, why choose something so destructive to your own cause?

Peace the fuck out.

TV: The Dark Matter Pilot (or: a Belated Eulogy for SGU)

I like to watch pilots of TV shows. It’s generally understood that they tend to be rough around the edges; neither the characters nor the plot are expected to be fully fleshed out. But they need to show promise – they do need to make some argument about why the show will be interesting to watch. Most people will likely sympathize when I say that 90%+ of series don’t hold any interest for me at all. So when I feel compelled to write about something I liked, that’s a real sign of passion – and when I am disappointed in something to the point of blogging about it, that is also because of passion. Science fiction is, among other things, a subject I do care about, and it’s also a ready source of disappointment because it’s so easy to get wrong – both subjectively and objectively.

The opening scene of Dark Matter shows us a starship adrift, switches from an exterior overview of the vessel to a camera gliding through empty, damaged corridors, until finally centering in on a bunch of confused people who have just arrived at the place. I worded this a little vaguely because that is exactly the opening scene of Stargate Universe, too.

The differences in execution could not be more pronounced, however. I was never a Stargate fan before, but when that wonderful, rusty old space ship dropped out of FTL to a perfectly orchestrated soundtrack, when the camera panned through the corridors as ancient machines came back to life for the first time in thousands of years, when the view finally centered on that open portal with a stream of panicked evacuees coming through it, I fell in love with that show. Within the first two minutes. The set design was so unique, the space ship built with love and attention to detail, the characters had so much chemistry that I only figured out much later they were in fact not a spinoff of some pre-existing ensemble of characters and actors. These things made me more than eager to forgive and forget many, oh so many, of the series’ flaws for most of its run.

Judging from the pilot, however, Dark Matter could not be more different, although it is very clearly and intentionally positioned to fill the niche SGU left behind. Where SGU’s actors and characters were varied and (mostly) interesting, the people in Dark Matter are all young, beautiful, and exceedingly bland. Maybe the utter lack of personality is by design, because the writers literally gave these characters numbers in lieu of actual names. The problem is that their generic qualities are emphasized so oppressively that, as a viewer, I really don’t care about any of them either.

Dark Matter’s set design and overall cinematography are so badly done they actually made me laugh out loud at several points while watching. I swear some of these scenes looked and felt as if I were watching raw footage from an irony-free science fiction LARP or, alternatively, a bunch of business school students in a laser tag arena.

When a generic science fiction show was featured on Castle one time, it had a more thoughtfully built set and better CGI and special effects than Dark Matter. And remember, on Castle it was meant to be funny, while on Dark Matter I couldn’t help but chuckle when I saw the twentieth generic spark pyro go off in that generic ship corridor (to hammer home the message that the ship is damaged), or whenever the hilariously bad computer console action was taking place (which was often). My suspension of disbelief was not just interrupted a few times – no suspension was ever allowed to build up in the first place.

And the plot, oh the plot. Amnesia. Literally. It’s like that Buffy episode where Willow had erased everybody’s memory, only without the lovable characters. It could still have worked though, if the show had any atmosphere, or interesting figures, or even just some cool tech.

I apologize if anyone involved in the show ends up reading this, but as a viewer I felt like you were mocking your audience. Like you didn’t care how unconvincing the ship looks, or how strenuous it would be to reach into the deepest wells of patience to muster some semblance of interest for these forgettable characters. The trouble is, and we both know this, dear hypothetical show runner, that you clearly did care at some point – and so did I, obviously; that’s the source of my disappointment. It comes from the fact that this could have been great. A show like that doesn’t get made without people being passionate about it, and it’s extremely sad that none of this passion made it across the screen. I don’t claim that I could have done better, but I do claim that what probably sank this show before it began is that someone was missing somewhere in the production process – someone who says “no” to things.

SGU, for all its many problems, in hindsight, probably had such a person on the team. Someone who cared about making a couple of painted plywood walls and 3D models into an actual place inside the minds of their viewers. Someone who cared about making good hard-science-fiction plots instead of just doing variations on Farscape. Someone who cared more about hiring interesting actors and writing actual characters than merely trying to appeal “to a younger audience”. Of course, on SGU, that hypothetical person didn’t always win. But they made enough of an impact to foster a real connection to the show.

It seems to me, from watching the pilot, that nobody on the Dark Matter team cared about these things enough to fill that role. To do good science fiction, it’s not enough to merely append “…in space” to plots and sprinkle them liberally with tech concepts pilfered from 50 years of shows that came before. It must be intrinsically cool. It must appeal to nerds and normal people alike. It must take risks, and it must above all else create a believable world for 40 minutes at a time. These things are incredibly hard, because on top of all that you still have to deliver all the other features people expect from entertaining shows. I couldn’t do this, not even remotely. But I do see when it’s not working. I do see when writers and producers don’t care in places where they should have. And it disappoints me, because all that budget, all those years of hard work, all that promise, and all that vision was just dumped on a barren field and left to rot, because someone assumed sci-fi nerds would eat anything.

I regret not the hour spent watching this and blogging about it, I regret that it’s another opportunity unfulfilled.