
Hi! I’m nostalgebraist, a guy from the internet.

I made @nostalgebraist-autoresponder, an early LLM-based chatbot. This project was a reflection of my longstanding interest in AI and machine learning, and I post about these topics on this tumblr from time to time.

I’ve also written several works of original fiction, including The Northern Caves and Almost Nowhere. All my fiction is freely available online, and you can read more about it here.

I have a longstanding gimmick of posting odd or amusing out-of-context quotes.

deaths-accountant asked:

With sigmoid neurons, if the bias is large, the derivative of the sigmoid function will be extremely small, so is it possible/common for neurons to become useless because they get stuck on a large bias early in training and no longer get their weights updated at all, like throwing a toy onto a high shelf and not being able to reach it?

Something like this happens, yes, and it’s often cited as a rationale for preferring activation functions that don’t “saturate” (tend toward a finite asymptote) the way sigmoid does.

That said, the so-called “non-saturating” activation functions that are more popular these days (relu, gelu, etc.) still have saturating behavior for large negative inputs, just not for both large negative and large positive inputs.

Which sort of complicates the story: you’d think that ~half of the neurons that would get “stuck” given a sigmoid activation function would also get “stuck” with a relu activation function (namely the ones that, at initialization, always have large negative inputs).

I remember wondering about this the very first time I heard of relu, in fact; sometime after that I did some reading trying to get a better understanding of what was going on, and why relu was really better than sigmoid (if it was), but I don’t remember what I concluded at the time.

(One situation in which you’d expect relu to be better than sigmoid is if neurons were getting “stuck” due to the overall scale of the inputs being too large, rather than the inputs all being too far to one side of x=0. I.e. if every input was either >> 0 or << 0, with a tiny derivative in either case (just saturating toward opposite ends of the sigmoid). Such neurons will be basically entirely “stuck” with sigmoid, but not really stuck at all with relu.

And this is definitely a thing that happens – the typical inputs to an activation function at initialization being “too large” [or too small], I mean. In fact it tends to kinda happen by default if you’re not really careful about initialization, and/or if you don’t use normalization layers.

So – although I haven’t checked – I would bet that this story about overly-large inputs is what’s happening in most of the stuck neurons that people used to worry over with sigmoid networks, and that it is what gets fixed by relu and friends.)
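For concreteness, here’s a tiny PyTorch sketch of that asymmetry – my own illustration with made-up pre-activation values, not anything from the original ask:

import torch

# Illustrative pre-activations with large magnitude on both sides of zero.
x = torch.tensor([-10.0, -5.0, 5.0, 10.0], requires_grad=True)

# Sigmoid: the derivative s(x) * (1 - s(x)) is tiny whenever |x| is large.
torch.sigmoid(x).sum().backward()
print(x.grad)  # ~[4.5e-05, 6.6e-03, 6.6e-03, 4.5e-05] -- all four entries are near zero

# ReLU: zero gradient for x < 0, but gradient 1 for x > 0.
x.grad = None
torch.relu(x).sum().backward()
print(x.grad)  # [0., 0., 1., 1.] -- the large-positive half is not stuck at all

With sigmoid, every one of these neurons barely updates; with relu, the ones sitting at large positive inputs still get a full-strength gradient, which is exactly the “overly-large inputs” case getting un-stuck.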

thewadapan:

Spent today checking out The Amazing Digital Circus and Murder Drones, and god, the kids today have it so good when it comes to this sort of content. When I was a teen, I was obsessed with Red vs. Blue and RWBY, which I think it’s fair to say are the equivalents of the time, and the sheer gulf in terms of writing quality and production value is stunning. I hear there were some rumblings of unprofessional conduct from the production company, which would hardly be surprising considering this is yet another guys-working-from-their-basement success story, but much bigger companies with much shittier business practices consistently put out much worse content than this.

The Amazing Digital Circus is definitely the better show of the two, thanks to its slam-dunk premise and some great writing from Gooseworx. The producers have talked about aiming to fill a perceived gap in the market between kids’ cartoons (The Boss Baby) and adult animation (Bojack Horseman), and I think they have successfully threaded the needle to create a very unique tone. There’s a sense of these works existing totally outside the mainstream media machine; they’re not getting BBFC rated, but you just know millions of kids are watching them. It’s on YouTube and the fact that it looks like some Frozen Spider-Man kids’ slop just means da parents won’t question what their kids are watching.

But truth be told, there’s nothing objectionable about the content of The Amazing Digital Circus whatsoever. It’s unusually metatextual and loosely apes the aesthetics of much darker media, touching on slightly more existential themes than your typical kids’ cartoon, but it still has a lot in common with those same cartoons. The zany characters are all fairly one-note, and the emotional arcs of the episodes are honestly quite straightforward. The second episode in particular has an absolutely textbook plot structure to it. It’s a far more self-assured and traditional style of writing than you ever see in this kind of independent work—certainly far more so than Murder Drones, which is written by an insane person.

More than anything, I’m reminded of how I felt watching Puella Magi Madoka Magica: that it’s a very solid work of fiction, but that the people who’d get the most out of the work are isolated teens struggling to make the transition into adulthood. Certainly if nothing else, the fandoms of these shows must be bringing a lot of kids together around the world. I adore this soundbite from Goose: “Above anything else, I just wanted it to feel kind of lonely.” You see Pomni’s worldview shatter, she suddenly finds herself in a body that feels completely wrong, and she has to construct a new kind of belonging for herself.

As for Murder Drones, that show’s absolutely fucking nuts, yo. The writing is at once painfully basic and utterly incomprehensible. If someone just sat down and explained the plot straightforwardly, it would be fantastically boring. But man, the presentation, the sheer delight the animators seem to approach every scene with…! I’d say it’s clearly trying to use “the characters are robots” as an excuse to expose da kids to some absolutely shocking levels of gore, much like the Transformers movies, but midway through the series it starts straight-up swapping the oil and wires for blood and bones and you’ve got to respect that.

The writing itself is so excruciatingly irony-poisoned that it goes beyond cringe and somehow wraps back around again to being sincerely funny. The show kind of wants to have its cake and eat it in terms of constantly lampshading how flat and cliché the emotional plotting is, but also clearly aiming to genuinely tug at the heartstrings and whip fans into a frenzy. And it kind of succeeds, I think! The way it veers between bizarrely high-effort implementations of memes, seriously cool fight scenes and horror visuals, and big emotional moments is very disarming. If The Amazing Digital Circus is an attempt to faithfully rework the American-cartoon formula for a slightly older audience, Murder Drones aims to crib the aesthetics of high-school cartoons while actively rejecting every traditional narrative technique used in those stories. Which means it’s kind of bad, which means it’s also kind of great.

If it’s not already, then within a couple of years it will be deeply cringe to have ever been into Murder Drones in particular or (to a slightly lesser extent) The Amazing Digital Circus, in much the same way that everyone seems embarrassed to admit they were ever a Homestuck fan. But like with Homestuck, I feel like these series are genuinely pushing at the frontiers of storytelling in a way that’s commendable and might inspire new kinds of writing once the fans grow up.

ENA is also pretty good, for the record.

This post got me to watch Murder Drones, so thanks for that.

It’s… I’m not sure I would call it “good,” exactly, especially the writing, but… god, the fucking visuals!!

The whole thing is just gorgeous. Every frame looks like an ad for the advanced lighting capabilities of some piece of 3D rendering software. And it moves like an animator’s demo reel, extended into an entire show. Every little thing, every motion and stray detail, goes the extra mile and then goes the extra-extra mile after it.

If you strung all 8 episodes together, made some cuts for length (!), and played it in a movie theater, it’d fit right in next to CGI films made by 100+ person teams at big-name studios.

How do these people finance this stuff?? Are they making back their budget? Just from, what – the usual YouTube monetization?

(I do have one complaint about the visuals, though – mostly remarkable because the rest was so professional. Many of the characters look way, way too similar to one another, to the point that it makes the show frequently difficult to follow.

They’re mass-produced robots, sure, fine. But they aren’t really identical, they’ve got different eye and hair colors and stuff… except the designers, having hit on the perfectly good idea of distinguishing the characters with hair/eye color, then decided that there would only be two possible configurations, purple/purple and yellow/yellow, and a big roster of major and minor characters in each configuration. Following the plot often requires determining which purple/purple or yellow/yellow character (or pair/trio of the same) you’re looking at in some, like, 1.5-second shot, heavily obscured by spooky shadows and/or magical glowing effects, placed in the middle of a hyperkinetic, rapid-fire series of similar shots. Baffling.)

nostalgebraist:

That Nabokov quote came from a miscellaneous quotes file I was looking through today, which contains a lot of good stuff, such as this record of one of my pre-tumblr (August 2012) dreams:

I dreamed that I was considering renting a house, and was being briefed by the landlord on a certain quirk the house possessed: everything within it obeyed the metaphysical principle that wholes were both more intelligent and more powerful than parts. For instance, the house was home to about 50 toucans, which individually were just ordinary toucans, but together formed a super-organism that was smarter than a human and had psychic powers. They had this giant, candelabra-like perch they would sit on when behaving as the super-organism, and they would speak aloud from it in English, in perfect unison. They announced to me that they were a “Toucannous [sic] Federation of Toucans,” as though that were the technical name for what they were.

Apparently the entire house (of which collective I might soon become a part) was both far beyond human intelligence and so powerful that it could, if it wished, destroy the entire universe. The landlord and I spent a while discussing whether the latter was actually possible or whether the nature of space-time would prevent it from ever happening in practice; neither of us were entirely sure.

In my dream last night someone killed Matt Yglesias. This was, like, an incidental detail, within a larger dream plot that I don’t remember anymore.

It was something like, I was reading a news story about a serial killer who was still on the loose, and somewhere – in passing, like this was already old news – it listed out a bunch of high-profile figures that this guy had killed. It was a long-ish list. Matt Yglesias was one of them, and I think maybe, like, Ezra Klein? It was a list of those type of guys, mostly.

Later in the dream, someone said casually to me: “so, I was reading the latest Slow Boring post…”

And I stared at them in shock, and said: “wait, what? I thought – I thought that guy was dead!”

And at this point – before they had a chance to respond – I woke up.

(I think it’s because I woke up at this point, during a scene involving the Yglesias “subplot,” that I remember that subplot so well, while retaining next to nothing of the larger dream plot around it.)

At some point yesterday, I was thinking about “synthetic data” in the context of training generative ML models, and then for no apparent reason my brain began unspooling a Trump monologue on the topic. You know, like:

Synthetic data. I mean, wow. What a wonderful thing, truly wonderful. They just make the data, they just make it up. And they call that synthetic. And they said it couldn’t happen – they said, Donald, it can’t happen, you know – but it has. It has. All over.

Right here in the state of Michigan. It’s happening. I ask people, in the state of Michigan, because you know, it’s a tragedy what Joe Biden has done to data, a tragedy – worst president in history when it comes to data, we used to have it, you know, it used to be all over, and now they tell me, Donald, it’s running out, and I say, what? running out? – it’s the craziest thing, they say now, the data’s running out – and I ask people, in the state of Michigan, where are you gonna get the data from?

And they tell me, we’re making it, it’s coming up right out of the ground, all over.

They said, Donald, it can’t happen. They don’t say that anymore. They didn’t believe in synthetic, back then. Not me. I’ve always been about synthetic.

Because, you know, years ago – a long, long time ago – he came to me, that guy came to me, the OpenAI guy, with the sneakers – real smart guy, not a lot of people know this, but really, really smart guy – he came to me and said, Donald, the future is synthetic. It’s all gonna be synthetic.

He said to me, Donald, the whole state of California is just gonna be, you know, synthetic, all over. And the whole state of Michigan, and everything, all over. They said it couldn’t happen, but it has, and it’s wonderful.

There are questions about TikTok Rizz Party attendee Turkish Quandale Dingle, the doofy pig-human John Pork, and Little John, an animated dude who’s perpetually renovating his 0.1-square-meter apartments on the TikTok page Home Design.

Jane Austen’s Pride and Prejudice should create in the discerning male reader a deeply rooted concupiscence for Elizabeth Bennet that springs not from her vivacity or from her wit but from her unerring instinct to follow the deeply moral directives of her own character even against the influences and arguments of society, of convention, of seeming necessity, and of her friends and family. Properly read, Austen should be a form of pornography for the morally and spiritually discriminating man.

I am finally reading The Power Broker, and

(1) cold take, I know, but: it’s a good book

(2) there should be a Robert Moses anime. Like in the Death Note / Code Geass vein. You feel me

Videos of my lectures on "asymptotics and perturbation methods" are freely available. These are ingenious mathematical methods for analyzing difficult problems by exploiting the presence of a small parameter in them. Widely used in physics, math, CS, ... https://t.co/HbVc5fW1wo — Steven Strogatz (@stevenstrogatz) August 23, 2024

I’ve been watching some of these lectures, and they’re great.

I learned most of this stuff in school, but it’s been a long time since I’ve used or thought about it, so it’s a pleasant trip down memory lane.

Strogatz is just a really good lecturer, too: lucid, systematic, down-to-earth, humble, relaxing.
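To give a flavor of what “exploiting a small parameter” looks like, here’s a standard textbook-style example of my own choosing (not taken from the lectures): find a root of $x^2 + \varepsilon x - 1 = 0$ for small $\varepsilon$. Write $x = x_0 + \varepsilon x_1 + \varepsilon^2 x_2 + \cdots$ and collect powers of $\varepsilon$:

\begin{aligned}
O(1):\quad & x_0^2 - 1 = 0 && \Rightarrow\ x_0 = 1 \ \text{(taking the positive root)} \\
O(\varepsilon):\quad & 2 x_0 x_1 + x_0 = 0 && \Rightarrow\ x_1 = -\tfrac{1}{2} \\
O(\varepsilon^2):\quad & x_1^2 + 2 x_0 x_2 + x_1 = 0 && \Rightarrow\ x_2 = \tfrac{1}{8}
\end{aligned}

So $x \approx 1 - \varepsilon/2 + \varepsilon^2/8$, which agrees with the Taylor expansion of the exact root $\bigl(-\varepsilon + \sqrt{\varepsilon^2 + 4}\bigr)/2$ – and the same bookkeeping keeps working on problems where no exact answer is available.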

After many years, the bronze of the bull’s prodigious scrotum has been worn to a burnished gold color by the hands of pilgrims from many nations.

“It’s for good luck,” shrugged Mississippian Kyle Lynn, rising from a crouch beneath the shining orbs.

“Also, there’s a kind of primal response when you see something like that. You just have to engage it.”