Regarding recent rulings of the US Supreme Court…

OK, let’s get one thing out of the way. Multimedia and game development is hard work. It’s fulfilling work, it’s often (but not always) enjoyable work, but it is an absolute mountain of a task. It often requires extreme focus, thorough organization, and mindful physical and mental health.

That said, the recent Supreme Court rulings, including (but not limited to) the overturn of the half-century-settled Roe v. Wade, have been an enormous stressor. I’m fortunate, in that I live in New Mexico, where our government regularly acts to protect citizens from these rulings, particularly the out-of-the-blue and, in this person’s opinion, unconstitutional ones; but it does little for the stress.

Texans are my countrymen, in that we are brothers. Texan women are also my countrymen, and they are my sisters. Florida, West Virginia, Missouri, Wyoming… some of these places I once lived in myself. There’s a divisiveness in my country right now, in which it has become too easy to point fingers at the other side of the line and claim that, on account of some legislative mistake, all of the people living there are the same, or even support it (or even that a majority support it). That’s just not true.

It isn’t that I’m not loaded down with anger right now; I am. An entire house of government is beyond a legitimacy crisis. The right in my nation’s politics has become the extreme far right, siding with unity over reasonability, as though, if everyone sinks together, that will somehow have more merit than changing sides and fixing the leak. For nearly ten years, the UN has classified lack of access to abortion care as torture. It reiterated that recently, after Roe v. Wade was overturned. It’s right: childbirth is both painful and dangerous, and should be forced on no one.

There’s been another case recently where the court stripped the EPA, and sibling agencies such as OSHA, the FDA, and even the FBI, of their ability to effectively go about their core purpose. This has led us into a concerning potential treaty violation with China, among others, and I can’t help but feel that the aim of this court has been to break down all forms of federal control.

The six justices who consistently vote conservative on these issues were all put in place, for an exasperating lifetime position, by presidents who lost the popular vote. Three of them were appointed by a president who turned insurrectionist after losing the election. One of them is married to someone who was actively involved in the January 6th, 2021, attempted insurrection.

I wish I didn’t have to write a piece like this, but it is not just affecting my workflow, it’s affecting the software and game industry. Texas has been a game development hub for years, and many of us are women. Many of us are minorities, or homosexual. We’re in this whether we like it or not.

I believe that, in retrospect, right now will be classified as the second American civil war. But war has changed; during the first Civil War we didn’t have cruise missiles or a strong navy. The air force wasn’t even a concept. The only automatic weapons we had were stationary Gatling guns, which weren’t in wide deployment. So we fought, and we beat each other in the mud and the blood until one side had to step down. That was only a hundred and fifty years ago.

This claim is not without precedent. In the spring of 1861, after Confederate soldiers had bombarded Fort Sumter and shots had already been fired, even the best and brightest in the country refused to believe what was happening. In fact, former South Carolina senator James Chesnut—one of the ultimate instigators—promised to drink all of the blood spilled in the conflict, popular belief at the time maintaining that it would be “not a thimble”. No one wanted a war, so no one wanted to believe that it was already happening.

I take sole satisfaction in knowing that we can’t regress as far as this Supreme Court would have us regress, as that ship has indeed already sunk. You cannot survive in the twenty-first century with nineteenth-century ideals and methods.

I will continue to write my algorithms, my books, and my games and programs; I will continue to publish. I have to. Even in Ukraine, development of S.T.A.L.K.E.R. 2 is continuing right now, in spite of the fact that most of the developers are firing actual assault rifles at intruding Russian forces during the day. Why? Because life is not a singular task, and this presents a sense of normalcy to people who need it.

In sum, this is a message of love and solidarity. It’s a reminder that this is neither the first nor the last storm we will weather, and that we have overcome the previous ones. It’s a reminder that the best and brightest in this country know that this outlandish violation of basic human rights, even if it should stand for a while, will perish in the end under the steel-toed boot of the righteous.

For those who happen to be on the other side of the argument, know that I will forgive you, in spite of everything; but first, we both need you to step down. We’re going to be mired down for the next decade, sure; but one way or another, this won’t work in the end.

IJobs in Unity, with Random Number Generation

For background: I’ve lately been working on a project that involves generating random terrain in Unity from a combination of Worley and Voronoi (cell) noise, with a few matrix-based operations mixed in. This is a logically straightforward task (given that you understand how your noises work!) but ultimately very computationally expensive.

If I were using something like my-main-man C, I wouldn’t be so concerned; but Unity requires C#, which is both a blessing and a curse. C# is managed code, which makes it much harder to slip and have a memory leak. It’s also a mostly1 well-maintained language and is relatively easy to learn, without being abstracted to a crippling point.

That is, unless you’re doing something very close to the hardware, or very computationally intensive. The truth is, managed code comes with a price. If I were using C or Assembly, my code would often be moving on the order of a hundred times faster. Breakneck speeds through a lattice of severe safety pitfalls. That’s not usually a big deal, as modern processors are blisteringly fast and programming is not an easy skill—sometimes it’s more economical to have the job done quickly than it is to have the completed job run in a fraction of the time, especially if you’re only saving milliseconds.

It’s my personal opinion that object-oriented programming (OOP), for all its legitimate benefits elsewhere, has contributed exceedingly little to the multimedia (and game development) scene. It has a tendency to turn people into champion office-software programmers, when loop-driven programs like games follow a completely different set of rules. Object pooling, as an example, and data-driven code, become foreign concepts, when they can make all of the difference in the world!

So, enter ECS. ECS stands for the Entity Component System, which is, I think, a very vague and obfuscated name for it. It involves data-driven development instead of object-oriented programming. Enter the Unity Jobs system, which we’ll be discussing here. Also enter the Burst Compiler, an equally poorly (but at least fashionably) named extension to Unity. All of them get around these obstructions in some way, without having to part from the comfort of C#—at least, no more than is strictly necessary.

Jobs bring in a specific new feature. Unity’s scripting is, by default, single-threaded; in other words, your game code uses only a single processor thread. This isn’t unique to Unity; the vast majority of programs are single-threaded. You may be working with a 4.0 GHz Ryzen processor with sixteen threads, but an individual program usually only uses one. Exceptions typically include web browsers—Chrome tries to keep a different thread for each tab, as I understand it—and programs which are attempting to perform many tasks simultaneously, like some renderers.

The drawback of this is that, in classic Unity, every line of code you type ends up being queued on the same thread, hence the name: the “main thread”. You can’t run more than one instruction at once; you’re just running them extremely fast. Since we’re usually still going around 120 frames per second, that isn’t typically a problem, until you have an extreme operation that does require more time than the main thread can spare.

This is where the job system enters. Jobs, unlike their ancestral coroutines, can run on their own thread. When accelerated with the Burst compiler, they can run exceedingly fast. (Remember that hundred-fold factor I was talking about? Here it is.) Additionally, unlike when using C, you can readily compile to any platform supported by Unity with minimal work.

So the naive thought from here is: why don’t we use it on everything? Well, therein lies the catch—only certain data types can be used, specifically blittable data types. Blitting is short for block transferring (just as bit was short for binary digit—programmers love these things). It means that a streamed chunk of data of this type can be copied, en masse, from main memory to processor caches. Since, compared to the speed of your processor, your memory is wrapped in architectural red tape and moving at a snail’s pace, this is critical stuff.

Non-blittable types are typically pointers to locations in memory where the actual data is. Sometimes, they’re pointers to lists of pointers. There’s a lot of he-said-she-said reference tracing when using them, but they open up a lot of new possibilities, like OOP, in their use; and I’m not bad-mouthing OOP, I’m just saying where it’s out of place. If you only need a few of them, that’s not a problem. If you need a bunch of them, hundreds or thousands or even millions, then it’s going to cripple your operation.

Blittable types are, in the context of C#, defined as types that have the same representation in managed and unmanaged memory. The program’s marshaller, or the module that converts types between managed and unmanaged memory, does not touch them. That’s about as close to the hardware as I’ve seen anyone get with C#. It also means that they can be block-transferred to a new thread.

Threads, by design, cannot know what each other are doing. This is part of where the speed-up from using multiple threads comes from. Because of this, following reference (non-blittable) types becomes very difficult. By ensuring that every type is a blittable type, and all are present on the thread before it runs, we avoid this; in fact it’s required for many parts of the Unity Job System and Burst Compiler.
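As a loose illustration in plain C (not the Unity API; the struct and function names are my own) of what being blittable buys you: a plain-data struct has the same layout in any context, so a whole array of them can be handed off with a single block copy.

```c
#include <assert.h>
#include <string.h>

/* Hypothetical sketch, not Unity code: plain data can be moved with one
   memcpy, with no per-element marshalling or reference tracing. */
typedef struct { float x, y, z; } Float3;

static void blit(Float3 *dst, const Float3 *src, int count)
{
    memcpy(dst, src, (size_t)count * sizeof(Float3)); /* one block transfer */
}
```

A reference type, by contrast, would force the copier to chase each pointer individually, which is exactly the work the job system is trying to avoid.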

Blittable Types List

Additionally, there are functions which cannot be called from these jobs, some of which are surprisingly useful to have around. While many, such as anything touching MonoBehaviour, are unsurprising and not that impeding, we unfortunately also lose access to virtually all parts of the UnityEngine namespace, including Random.

In my case, I was attempting to seed the random number generator with a coordinate location’s hash value, and then acquire feature points from it. Since that is no longer possible, we can follow the easiest and most controllable solution and simply roll our own LCG (linear congruential generator), a type of random number generator!

Inside your job—and that’s important, it needs to be local to it—feel free to insert this function and definition.

private const uint RAND_MAX = (1U << 31) - 1;
uint LcgSeed;

private float rand()
{
    return (float)(LcgSeed = (LcgSeed * 1103515245 + 12345) & RAND_MAX) / RAND_MAX;
}

This is technically a word-for-word (aside from C# translation) copy of the BSD random number generator, which is an LCG where multiplier a = 1103515245, increment c = 12345, and modulus m = 2³¹. It’s converted to a floating point (Single) and then divided by its maximum possible value so we can get a value between zero and one, which is what I needed. It’s BSD-licensed and very well known.

Let’s break down that return line for clarity, as it’s a little complicated.

We’ll start with LcgSeed = . That is an assignment, but the assignment operator has a useful side effect of returning the assigned value. We need it because we want LcgSeed to update with each random value, to prevent repeats. What it’s set equal to, (LcgSeed * 1103515245 + 12345) & RAND_MAX, is the basic LCG equation with the BSD parameters; the & RAND_MAX is equivalent to mod 2³¹, since the modulus is a power of two. It’s finally divided by its maximum possible value in an effort to limit it to, for me, the unit cube; but there are many occasions when a random value between zero and unity is of value!
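For readers who want to poke at the sequence outside Unity, here’s the same BSD-parameter LCG sketched in plain C (the helper names and the demo seed of 42 are my own, purely illustrative):

```c
#include <assert.h>

/* Same BSD parameters as above: a = 1103515245, c = 12345, m = 2^31.
   The mask is equivalent to mod 2^31 because the modulus is a power of two. */
#define LCG_MAX (((unsigned int)1 << 31) - 1)

static unsigned int lcg_seed = 42u; /* arbitrary demo seed */

static float lcg_rand(void)
{
    lcg_seed = (lcg_seed * 1103515245u + 12345u) & LCG_MAX;
    return (float)lcg_seed / (float)LCG_MAX; /* normalize to [0, 1] */
}
```

Reseeding with the same value always replays the same sequence, which is exactly the property the terrain generation relies on.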

Replace your typical InitState call, which is part of the UnityEngine package and inaccessible from the job, with a simple assignment to LcgSeed. Since I’m generating vectors which ideally need to be identical for each call at adjacent locations, I just use the Vector3’s hash value; but that’s much more complicated than the rest of this and will likely be addressed in a book, maybe cursorily on the website.

Remember that you will need to initialize LcgSeed before you call it, as another side effect of IJob, on account of it being a struct instead of a class, is that it can’t have initializers in field declarations. However, since in many cases we don’t want a truly random value so much as a one-way hash from a location, this is almost an advantage.

Replace each Job-internal call to Random.value with rand(), or your variation on it, and you will do just fine for both performance and noise generation on an IJob.

1 There’s some concern about dirtying the language by introducing SQL-style syntax, which also bothers me a bit; but it’s optional and relatively small.

Brownian Noise

When we attempt to model a physical phenomenon for entertainment (or simulation) purposes, it’s less a question of perfectly fitting the real-world process that creates it, and more a question of getting close enough for the communication we intend. This is certainly true for Brownian motion.

For those who aren’t familiar, Brownian motion is the wobbly oscillation we tend to see in falling snow, or dust motes floating through the air: each particle is constantly jostled by collisions with the molecules of the surrounding medium, and carries a small influence throughout it, but otherwise acts independently. Other great examples include cloud diffusion, or sediment flow.

Obviously this is all over the place in modern games, but if we were to simulate it exactly, we would have serious issues. We would need to keep track of the properties of a crippling number of particles, along with their influence on neighboring particles. This is possible in theory, but not readily in practice.

So instead, we take on what’s called an Eulerian view (as opposed to a Lagrangian one, which would be keeping track of each individual particle), and just consider the net flow!

I’m going to go over some modern C code for what’s called Fractional (or Fractal) Brownian Motion, fBm for short, here. Then, I’m going to explain it, bit by bit. The only thing I’m going to leave in the abstract is the noise function, which can be any random number generator of your choice; an LCG, /dev/random, whatever you would like.

Every modern language I know of seems to have at least one LCG random function built into its libraries, so this shouldn’t be a problem for you; just understand that noise(p) takes the point p as a parameter and uses it to generate a difficult-to-predict value.

(If you must generate one, and all you have is an LCG expecting a scalar number, I suggest simply multiplying each dimension by some factor, running it through the same LCG with its value as the seed, and popping the result into a return vector. It doesn’t have to be perfect, just convincing.)
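That fold-and-reseed suggestion can be sketched in C. Everything here is my own illustrative choice, not from any particular source: the mixing primes, the helper names, and the double LCG pass.

```c
#include <assert.h>
#include <stdint.h>

/* One step of a BSD-style LCG, masked to mod 2^31. */
static uint32_t lcg_step(uint32_t s)
{
    return (s * 1103515245u + 12345u) & (((uint32_t)1 << 31) - 1);
}

/* Fold a 3-D point into one integer, then run it through the LCG so that
   nearby points decorrelate. Coordinates are assumed small enough that the
   float-to-int conversion doesn't overflow. */
static float noise3(float x, float y, float z)
{
    uint32_t h = (uint32_t)(int32_t)(x * 73856093.0f)
               ^ (uint32_t)(int32_t)(y * 19349663.0f)
               ^ (uint32_t)(int32_t)(z * 83492791.0f);
    h = lcg_step(lcg_step(h)); /* two rounds, for mildly better mixing */
    return (float)h / (float)(((uint32_t)1 << 31) - 1);
}
```

As the text says: it doesn’t have to be perfect, just convincing, and crucially it must return the same value for the same point every time.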

We’ll need a handful of parameters to get this to work in a customizable fashion. Aside from the point we’re trying to generate this noise at, we’ll also want the fractal increment, the lacunarity, and the number of noise octaves. That probably threw many readers for a loop, as it isn’t at all obvious what those mean; so let’s discuss them in turn.

Octaves

Fractal Brownian motion is technically not one layer of noise, but several, summed, and usually normalized (which just means spanned from zero to one). Each component noise is a wave of a specific frequency.

If all of the waves were of identical frequency, we would lose the character of the fBm, and just have what equated to a scaled noise pattern. So, with each component wave, we need to increase the frequency by a certain amount.

“Octave” is a term most are more familiar with musically—the number after a letter note, like C6, or D7, is its relative octave on a piano. The catch is, a difference of one octave is a doubling of frequency—as middle C is 262 Hz, the C one octave above it (C5) is 524 Hz. The same principle applies for our octaves in noise.

(In fact, you actually can tie this to something like an LV2 and create some weird, by which I mean awesome, sound effects; but that’s much harder to demonstrate on a web page, so we’ll be sticking with graphics for now!)

The more octaves you add to your noise function, the more detailed and winding the result is. But, this brings us to our next point. We could just assume that each layer of noise is an octave higher in its sampling, but that does feel a little limiting. What if it were half an octave? Or two octaves? This has a noticeable effect on the form of the noise, so we’ll want it to be adjustable. This is the…

Lacunarity

So, if we wanted to go purely by octave, we would simply use a frequency equal to 2^x, where x is some integer. We might have a few constants and factors in there to muddy things up to our liking, but this would be the core of it. While there are plenty of simplicity’s-sake arguments for sticking with 2^x-styled values, when we aren’t necessarily going for simplicity, we aren’t beholden to it at all.

I’m going to mathematically refer to that two as L. It’s our lacunarity, and it doesn’t have to be two. Lacunarity is an interesting term for it, derived from the Latin lacuna, a gap or pool (itself related to lacus, “lake”). As lacunarity increases, the size of the gaps in our fractal increases in turn, so it’s defined less mathematically than phenomenologically—that is, by how it looks, not what it is.

(In fact, lacunarity actually has a very context-dependent definition; in the study of fractals, fractals are often classed by a lacunarity property which can be geometrically measured, with something like the box counting algorithm. This is not necessarily always related to our use of it here.)

The last element to look at, given our understanding of the need for multiple octaves, lacunarity, and a noise generator, is what’s going on in x. There’s nothing saying that it has to be an integer, or even a typical flat value. This ends up being critical for fBm, too. We refer to x with the…

Fractal Increment

Or, for short, H. Generally, in our octave multiplier L^x, x = −H × octave number. So, where i is an integer in the range of zero to the number of octaves, each noise value is multiplied by an amplitude of L^(−Hi).

(The choice of H for the fractal increment comes from the Hurst exponent, which in fractal theory is closely related to the Hausdorff dimension. We’ll use it out of tradition.)

The rationale behind the negative is to avoid the typically inconvenient behavior of a positive exponent, as those terms grow without bound instead of converging to zero. I won’t lie, sometimes that’s actually what we want; but usually we don’t, and it leads to very strange behaviors. Feel free to set your increment to a negative number, giving us a positive exponent, if you want to see the result!

This gives us quite a few elements which we can slide over for the sake of controlling our Brownian noise. Let’s look at our final math for it.

Final Mathematics

So, we’re iterating up to k octaves. For each octave, we will be adjusting our sample point, conveniently by multiplying it by the lacunarity. We’ll acquire noise from that sample point, and then multiply it by a factor of L^(−iH), where i is our current octave number. That is, we’ll be doing this:

value = \sum_{i=0}^{k-1} Noise(L^i point) \cdot L^{-iH}

Don’t worry if that looks scary, we’ll go over it in basic procedural code. It isn’t so bad.

We’ll start with something easy to understand, but not quite perfect, and work from there.

#include <math.h>

double fBm( Vector point, double H, double lacunarity, double octaves )
{
    double value = 0.0;
    for (int i = 0; i < octaves; i++) {
        value += Noise( point ) * pow( lacunarity, -H*i );
        point *= lacunarity;
    }
    return value;
}

(Again, remember that I’ve not defined what Noise is here; that’s just a function that takes a vector value and returns a random float. There are oodles to choose from, so pick what’s most convenient to you, in your environment. In fact, it could arguably be any function, period; but the squirrellier the results, the better.)

Let’s discuss it in detail. We take a basic Vector structure called point, and iterate over octaves with it. For each iteration, we add noise, seeded with our current point, to our final value. But, first we multiply it by our lacunarity to the power of negative increment times the octave number.

That amplitude multiplier of L^(−iH) ensures that each octave is smaller than the previous octaves in magnitude. Remember that if this were not negative, each value might be successively larger. There are uses for that, but they aren’t typical and aren’t usually useful.

Then, to take care of the variation of our sample point with each octave, we multiply it by lacunarity, so each call to noise will return a different value.

(If you would like to test the behavior of this function and don’t have a Vector class or struct handy, feel free to just use a double for it and watch the behavior of the resulting values over an iteration. You’ll just be seeing it in one dimension, but it will all still be there.)

We aren’t quite done, though. What if we have a non-integer number of octaves? Currently, the behavior of the function is floored to the integer immediately beneath the specified number of octaves. Thankfully, this is relatively simple to fix.

double fBm( Vector point, double H, double lacunarity, double octaves )
{
    double value = 0.0;
    int i;
    for (i = 0; i < octaves; i++) {
        value += Noise( point ) * pow( lacunarity, -H*i );
        point *= lacunarity;
    }
    double remainder = octaves - (int)octaves;
    value += remainder * Noise( point ) * pow( lacunarity, -H*i );
    return value;
}

Let’s discuss the additions here. I’ve declared i in the next-higher scope, because we’re going to need to refer back to it.

We get the fractional number of octaves by subtracting the integer value, which is traditionally the floor value, of octaves from it. (Some languages have a frac function that takes care of this, but this is usually what it amounts to internally.)

We then add this remainder value, multiplied by noise on our current reference point (and I remind you, this is different from the last one, as it’s already been multiplied by lacunarity), multiplied by L^(−Hi), to our final value.

Technically, this is it—you’ve got unbounded fractal Brownian motion. Dependent on your noise function, this could be a set of continuous vectors, a cloud of densities, or whatever you would like; the fBm part is primarily in the octaves method. However, if you’re going for something like densities, you may want to bound this value to [0, 1], so you can properly display it on a monitor.

How you choose to bound it likely depends on what you’re doing with it, but by dividing by the sum of maximum possible values, we can ensure that it’s always between zero and one.

double fBm( Vector point, double H, double lacunarity, double octaves )
{
    double value = 0.0;
    double sum = 0.0;
    int i;
    for (i = 0; i < octaves; i++) {
        value += Noise( point ) * pow( lacunarity, -H*i );
        sum += pow( lacunarity, -H*i );
        point *= lacunarity;
    }
    double remainder = octaves - (int)octaves;
    value += remainder * Noise( point ) * pow( lacunarity, -H*i );
    sum += remainder * pow( lacunarity, -H*i );
    value /= sum;
    return value;
}

Of course, you may just want it centered around the median or mean, which would involve keeping track of maximums and minimums for each value; but this is easily done with another variable and a comparison operator. It really comes down to the end user!

I would like to thank F. Kenton Musgrave for introducing me to this function on a technical level. Most of my changes to it are largely cosmetic, and the chapter is a great read in any case. Musgrave, F. Kenton (2002), “An Introduction to Fractals”, in Texturing and Modeling: A Procedural Approach, Third Edition, pp. 429–445.

The Results (in OSL)

I’ve rewritten this function in OSL (Open Shading Language) and implemented it in Cycles, to demonstrate its results. OSL conveniently includes a noise function which accepts any form of vector or color as a parameter. The final material is thus:

#include <stdosl.h>

shader fBM(vector uv = 0, float octaves = 1, float lacunarity = 2, float H = 1, output float hue = 0)
{
    float value, remainder;
    int i;
    value = 0.0;
    vector p = uv;
    for (i = 0; i < octaves; i++) {
        //*additive* cascade
        value += noise(p) * pow(lacunarity, -H * i);
        p *= lacunarity;
    }
    remainder = octaves - (int)octaves;
    if (remainder != 0.0) {
        value += remainder * noise(p) * pow(lacunarity, -H * i);
    }
    hue = value;
}

It’s likely that, if you’re familiar with C, you understand most of this too. I’ll go over the minor differences—each parameter to the shader has to have a default value, which is listed after the parameter with an equals sign and then its respective value. The output parameter just lists an output field from our shader, which is later plugged into an HSV node to produce a color.

I haven’t bothered to normalize it in script, as hue is circular anyway and will normalize itself. (Processor cycles are precious, after all!)

If we set UV to the object coordinate, octaves to nine, lacunarity to a basic two, and H to 0.5, we get this:

fBm demo’d over a hue space

Drop our lacunarity to one, and we have this:

A simple drop of lacunarity can dramatically change the effect!

If we drop our increment, H, to 0.1, we get a much noisier image.

Dropping our increment increases the frequency of change dramatically.

To demonstrate why our increment is multiplied by a negative, let’s drop it to a negative value itself to cancel it out. If we drop from 0.5 to -0.5, we end up with unbounded values, which, while it sounds cool, ultimately just gives us grey noise of infinite frequency.

Believe it or not, this is extremely fine color noise from a negative increment.

I promise that there are uses for this kind of behavior, but as you might guess, there are precious few.

Lastly, dropping the number of octaves simply removes fine detail from the fBm, and gives us a much smoother cloud that follows the same pattern.

Fewer octaves intuitively results in more regularity.

Hopefully, you have an idea how fractal Brownian motion actually works now. You can alter its input parameters to skew it in different ways, or apply it to any number of values. If we wanted a normalized gradient field, as an example, we could simply pipe our “hue” into the polar coordinates of the vector. It’s applicable in any number of dimensions—which is part of why I’m not picking what noise is for you—and can provide a convincing emulation of any number of real-world fields.

The Myth of Noise

Let’s step back for a moment.

There’s an ongoing problem among engineers today, and it’s an understandable problem. We like to think that our actions are always based on the foresight of science, but the truth is, operationally, we’re very independent of the empirical method. For us, it just has to look like it works; the semantics of how, while legitimately interesting, are not bringing our paychecks any closer.

This is… acceptable, though flawed. Engineers are much closer to artists than scientific researchers; we create. We love science, but we don’t always have all the answers, and a lot of the time, neither does everyone else. What’s worse, sometimes having scientific correctness is prohibitively expensive compared to having a passable result.

Not that scientific programming can’t be a thrill in itself…

So, that brings me to “noise”. There is a persistent myth among software engineers that noise is truly random, or, failing that, the notion that it can be made truly random. Both are amateurish bull-pucky. Noise is extremely comprehensible, and I dare say it even feels like a cheat.

Anyone who attempts to generate random numbers by deterministic means is, of course, living in a state of sin.

John von Neumann

Computers are entirely deterministic by design. Any function can be broken down into a collection of pure functions, for which the same input will always result in the same output.

We do have rand() in C’s <stdlib.h>, C++’s <random>, and any number of other pseudorandom—and that pseudo– is the important part—number generators, though. So how do they work? There are a lot of ways that they can work, but I’m going to start with the most common, the linear congruential generator, or LCG.

LCGs require only a seed value, which, in entertainment, is typically the time. They have an internal state; after receiving that seed value, every value returned by them is a new and reasonably difficult to predict—at least for entertainment purposes—number. How they operate internally is painfully (and arguably beautifully) simple.

You see, every single LCG can be broken down to the same formula:

X_{n+1}=(a X_n + c) mod m
The LCG Formula

I get that this may look complicated to some, but it’s blissfully not. Borrowing LaTeX notation, which should be clear enough to most: X_{n+1} is our returned value; that’s the new random number. X_n, on the other hand, is the last random number generated. That is, it is the internal state.

It might also be the seed value.

The remaining important factors are a, c, and m, namely the multiplier, increment, and modulus. For some, that mod function might look a little weird, but it’s just the modulus function, which is very similar to a remainder. (There are a few minor differences involving negative numbers, but I’m not going to bore you with the details here. Maybe in another blog post someday.)

To clarify, if A is three and B is two, then A mod B would be one— because three divided by two is one, remainder one. That’s almost all that modulus is, which is thankfully very easy for machines to do.

If we take this, and set our multiplier a to 214,013, and increment c to 2,531,011, and lastly our modulus m to 2³², then we’ve got the equivalent of Visual C++’s random number generator. There are plenty of alternatives to this, too. Most of them are even listed on Wikipedia—this knowledge isn’t as arcane as people like to think.

Incidentally, this is a huge problem for cryptography, as time itself is not random. It’s quite predictable. So, you wouldn’t want a system limited to an LCG to encode, say, bank account information, if it’s seeded with the time. Thankfully this isn’t a problem for entertainment purposes; while an LCG’s internal state can in fact be recovered from a small set of results, no player is going to bother doing that math mid-game.

Additionally, it is very easy to pick a bad set of factors for an LCG. There’s a lot of wonderful math behind them, which, for reasons of space, I’m not going to go into here; but any time you’re looking at a cheap algorithm for generating white noise, you’re likely to be looking at an LCG.

Feel free to implement one in Python right now—it isn’t any harder than it looks.
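Python works fine for that; to stay consistent with the C listings elsewhere in this post, here’s an equally minimal sketch using the VC++ parameters quoted above (the function names are mine):

```c
#include <assert.h>
#include <stdint.h>

/* X_{n+1} = (a*X_n + c) mod m, with a = 214013, c = 2531011, m = 2^32.
   The mod 2^32 falls out of 32-bit unsigned overflow for free. */
static uint32_t lcg_state;

static void lcg_srand(uint32_t seed) { lcg_state = seed; }

static uint32_t lcg_next(void)
{
    lcg_state = lcg_state * 214013u + 2531011u;
    return lcg_state;
}
```

Seed it, pull values, reseed, and watch the same sequence come back out: that’s the whole trick.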

For companies and devices that need “true” random numbers, it’s typical for an external unpredictable phenomenon to be used. I’m hesitant to use the word “random” here, as a guy who got his beginning in physics, as our universe, complicated as it may be, is still deterministic.

To illustrate this, the company Cloudflare famously uses a video of a wall of lava lamps to seed its encryption, as the lamps are much more difficult to predict. On *nix-based computers, the kernel typically maintains a pool of real “random” data (exposed through /dev/urandom and friends) pulled from keystroke timing, interrupt and disk timing, and any number of other elements outside of system control, which can be used to generate a cryptographically secure value.
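In Python, you don’t have to touch /dev/urandom directly; the standard-library secrets module asks the OS for exactly this kind of entropy-pool-backed data:

```python
import secrets

# Ask the operating system for 16 bytes of cryptographically secure
# randomness. On Linux this is backed by the kernel entropy pool
# (/dev/urandom), which mixes in hardware event timings like those
# described above.
token = secrets.token_bytes(16)
print(len(token))  # 16
```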

While these elements are virtually impossible to determine from the outside, they are the result of physical (albeit often chaotic) processes, with their own impulses and mechanisms controlling them. (Contrary to popular belief, even quantum mechanics is not truly “random”; it simply fragments perception of values into unknowable facets; there are still rules.)

So what do we really mean by “random”? Most commonly, we’re looking for data which is difficult to predict. A good LCG does this. For cases where LCGs aren’t enough, there are a number of derivatives, including inversive congruential generators (which feed the modular inverse of the previous result back into the formula, rather than the result itself), and permuted congruential generators (which scramble the LCG’s output with a permutation function to make it better-behaved statistically), among others.
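A sketch of the inversive variant, assuming a prime modulus so that every nonzero state has a modular inverse (the constants here are purely illustrative, not recommended parameters):

```python
# An inversive congruential generator: each step feeds the modular
# inverse of the previous state back into the LCG-style formula.
# The modulus must be prime (2^31 - 1 is) so inverses exist; Python
# 3.8+'s pow(x, -1, m) computes the inverse directly.
def icg(seed, a=3, c=7, m=2**31 - 1):
    x = seed % m
    while True:
        inv = pow(x, -1, m) if x else 0  # 0 has no inverse; map it to 0
        x = (a * inv + c) % m
        yield x

gen = icg(seed=12345)
print([next(gen) for _ in range(3)])
```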

Like so many things, it ultimately comes down to what you’re trying to do with it, and what will be passable for your purposes. You are unlikely to ever experience true noise, only mystery!

Tiamat Preview (Arguably NSFW)

Tiamat, Take One. Imagine her seventeen feet tall.

Allow me to start by saying that if this image is not safe for your workplace, your job sucks.

That out of the way, I’ve been working on an Enuma-Elish-Accurate-ish depiction of Tiamat. I know Tiamat is typically rendered as a dragon, but nothing ever said that she was one. She had “children” that were dragons, but she had children which were a lot of different things, and this is ancient Akkadian to begin with so God only knows how right we are about its meaning. We understand, like, one percent of that language.

But hey, far be it from me to interfere with a steady-going game of AD&D on a technicality.

And yes, she’s got eight breasts. She was described as having an “udder”. I could not get that to work with any kind of class, so I Total Recalled this thing. In science fiction, people always appreciate supernumerary breasts; but no one ever wants to admit it. This also counts against the dragon theory, as, while she was enormous by all depictions, you cannot milk a lizard or an amphibian. I think we can put that theory to rest.

I did her husband, Absu, some time ago. The theory was that they were the ancient gods of fresh water (Absu) and salt water (Tiamat), who created life between them and kind of accidentally ended up with humans too; they might not have held that against us, but they ultimately did not care for us. It became a bit of a problem after a while… the challenge here was to create something scary and weird, but also sexy and cool.

It couldn’t be scary as in “deformed freak”, but rather, the kind of thing that makes us feel like we’re the deformed freaks. I have a lot of work left to do—I like how her outfit is coming along, but it needs more detail. The Lovecraft Gold/Orichalcum body jewelry has a nice feel to it as well.

I also need to refit basically all of the major shape keys and correct the armature for animation (the tail needs any rigging at all, along with root-bone sensitivity), so it’s far from done. I’ll probably touch it all up next weekend; I need to get back to Realtorgeist. When it’s done, if I can find a decent pose for it, I might turn it into new knick-knack art.

Hope everybody else out there had a wonderful and productive weekend. Peace.

New Trinket Design


So today I was grinding at the same algorithm I’ve been busting at for a week, one which worked on paper but continues to demonstrate problematic behavior when implemented, and I finally had to take a breath.

Let me say, when you’re building an algorithm yourself, but cannot manage to keep track of what your code is doing or understand why it’s behaving poorly, it means two things. The first is that you may need to take a hike—literally go for a run and get some exercise—and then come back and look at it, from the beginning, with fresh eyes. There’s a good chance the problem is fundamental, and knowing this will help you get around it by applying different business logic to it.

The second thing to note is that you’re probably experiencing fatigue. I definitely am in the midst of some serious programmer’s fatigue. This is a very real thing, and if neglected (or band-aid medicated with something like excessive caffeine) it can turn into burnout, which can take years to recover from.

So, I’m breaking for the weekend and working on some artwork. Above is an idea that occurred to me on that very walk. You can get it on just about whatever you could possibly like, any kind of gadget or doohickey people personalize, right now, at Red Bubble. Do me a solid and buy one!

On my plate right now, I have two comics, a game which has gotten outstanding reviews in its pre-alpha, and a short film (arguably two) to work on, and the future is looking bright for all of them. I refuse to burn out now and am switching it up for a bit.

This weekend’s challenge is to render my vision of Tiamat, who was never precisely described in the Babylonian Enuma Elish. This is likely because of just how alien she was to our concepts. I know people usually think she was some kind of dragon, as some of her “children” were, but one, her children were a lot of different things; and two, she is plainly described as having an udder. You can’t milk a lizard. So, I’m getting creative with this one.

It may make a decent design in itself. We’ll see what happens. Hopefully in a couple of weeks I’ll have something final to put up here.

ArtStation Plus

The Frontier Medicine Entertainment Art Station Page

So, as in love as I am with their videos and lectures, I’ve decided to upgrade my account on ArtStation. With no disrespect to my current setup, I’ll admit that, depending on how easy its blog is to edit, I may ultimately move off of WordPress, which has otherwise been great.

This should not only allow me to continue improving my skill set, it should also promote some of my work automatically, which is never a bad thing in my business. Whether it ultimately subsumes my WordPress presence is hard to say just yet. I may keep this page around, for some time, as a non-premium item.

In other news, I have been working for some time on a demonstration of how to model clothing, in Blender, without using the cloth modifier. I’ve known for some time that, as magical as the cloth modifier seems to be, there isn’t much it does, outside of animation, that can’t be done by hand; and I’ve noticed it tends to backfire rather dramatically. Think of it less as a lecture, and more as a lab.

I should be finishing everything up through the shirt today; with boots, gloves, hair, and basic animation coming later. (I seem to have botched the audio recording on one section, so that remains a task to fix; but soon enough it will all be up.) This may be condensed into a basic, and much shorter, tutorial later.

Announcing my Red Bubble Store!

You’ll find it here. Every now and then I’ll get a little mentally tired with whichever piece I’m working on, and I’ll just doodle for a bit. Once in a while, I’ll come up with something brilliant while doing it.

Time to turn them into t-shirts and mugs!

So I’ve got two items so far and probably a few more coming through the rest of the week—the one featured to the right of this paragraph is a Christmas tree made out of fractal constructs, which admittedly is the kind of thing you either get or you don’t… which also makes it funnier.

And to the left of this paragraph, we have my take on Spider-Man: No Way Home (also a play on “Magneto Was Right” / “Thanos Was Right” / “Mysterio Was Right”). I’ll give it a seven out of ten; it was a darned good movie. There were only a few things that bugged me, and if we’re being realistic, they bugged me specifically, so I can’t complain. My wife and I had a lot of fun, and I’m looking forward to Morbius later in the coming year.

Give it a look now and then!

Absu, Babylonian God of Fresh Waters

Here we have last summer’s rendering of the occupied throne of Absu, one of the antagonists of the Enuma Elish (“When on High”), the Babylonian creation myth. I’ve always liked that one, but it’s safe to say I’m getting a little obsessed with the material at this point.

The interesting thing about that myth, aside from it being the oldest creation story we can date, is that the Babylonians never assumed they were first in the world. It was a world of monsters beforehand, and only after Marduk’s insurrection was it really a world of man. Absu (fresh water) and Tiamat (salt water) were creator gods, sure; but they honestly couldn’t care less about man.

As I read this stuff, I have all kinds of wonderful ideas. I think it’s time I brought it back as, at minimum, a graphic novel; if I can get a rendering farm and some quality models together, along with some voice actors, maybe even an animated series.

I’m going to return to Absu later, with a fresh sculpture of him. It was all done by hand, but there are still a few things bothering me—and yeah, I know, probably only me—quite a bit. There’s also reasonable composition to the scene but a total lack of video compositing, which would really fill it out.

I’ve also got some great ideas for Marduk, the hands-in-the-dirt protagonist and future God of Magic, and Ishtar, who should redefine sexy on the ground level. They’re going to take time too, though. This story is really drawing me in.