April 21, 2022

Trade VI: Simplifying to Complexity

I've been working on my algorithms to generate trade networks and manufactured goods. Although probably several years out of date now, Tao's Trade System remains required reading for anyone wanting to do something similar.

This iteration was a much-needed rework of the way a lot of the prices were generated. I had gone through Alexis' work linked above and simplified it to my needs. He is using references based in the real world, whereas I am trying to generate from scratch.

I found that of all the equations, the final price of an object simplified to just a few variables. The first is the local value of a unit (oz) of gold (in cp, all prices are expressed in cp and can then be abstracted back up to sp or gp as needed). I'll call this $g$. To find this number, we need the local availability $g_\ell$ and the total availability $g_t$. $g_t$ is the total number of references reachable from the point in question. This is easy to do with network algorithms. Local availability is a weighted distance calculation over all locations $i$ reachable from $\ell$: \[g_\ell = \sum_i \frac{g_i}{\textrm{dist}\left(\ell, i\right) + 1}\] By this calculation, if $i$ has 2 gold references, but is 4 days away, it contributes $\frac{2}{4+1} = 0.4$ references to $g_\ell$.

Right now, I am treating each hex as a node in the graph, but if there is no defined settlement there, all its resources are given to the closest city hex for pricing purposes.
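Here's roughly what that looks like in code: a sketch assuming a networkx graph whose edge weights are travel times in days and whose nodes carry a reference-count attribute (the names are illustrative, not my actual codebase):

import networkx as nx

def availability(G, loc, resource="gold_refs"):
    # Returns (local, total) availability of a resource as seen from loc.
    # "gold_refs" and "weight" are illustrative attribute names.
    dist = nx.single_source_dijkstra_path_length(G, loc, weight="weight")
    local = sum(G.nodes[i].get(resource, 0) / (d + 1) for i, d in dist.items())
    total = sum(G.nodes[i].get(resource, 0) for i in dist)
    return local, total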

We need a rarity factor $r$ that scales with the size of the network. As the network grows larger, a smaller $r$ is needed to balance things out. I'm trying this out for now: \[r = -\frac{1}{n}\] where $n$ is the number of nodes in the network.

The last two constants are the number of gold pieces per oz ($p = \frac{1\textrm{ gp}}{0.48\textrm{ oz}}$) and the ratio of cp to gp ($c = \frac{100}{1}$). These are easy to change. A half-ounce gold piece is somewhat hefty; many coins in history were much smaller. Gold is valuable enough that a small bit is worth a lot, and hence a great deal of value can be expressed in quite tiny coins. I like the idea of a gp being a weightier coin. I also try to base the sizes of my coins on modern analogs that I can actually show to my players.

A thousand of these is no joke

So the final equation comes together: \[g = \left(r \cdot \frac{g_t}{g_\ell} + 1\right) \cdot p \cdot c \]
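Or, as a small function (a sketch; the parameter names are mine, not the codebase's):

def gold_price(g_local, g_total, n_nodes, oz_per_gp=0.48, cp_per_gp=100):
    # Local value of one oz of gold, in cp.
    r = -1.0 / n_nodes       # rarity factor from above
    p = 1.0 / oz_per_gp      # gold pieces per oz
    return (r * g_total / g_local + 1) * p * cp_per_gp

In a location where local availability approaches the total, this tends toward $p \cdot c \approx 208$ cp per oz as the network grows.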

For an individual resource $q$, the equation is similar. First, we have to define the base cost $b$ of a unit of $q$. I found that this was the most important factor; it essentially represents the ideal economy where everything is in perfect supply. Whether a board-foot of wood is defined as 1 cp or 18 cp will have approximately an 18x effect on the final price of wood no matter where you are in the world. This becomes the key object of research when sketching out the system. It is then an easy matter to determine how many units of $q$ are equivalent to one reference of gold (assuming a gold ref = 1500 oz): \[\mathit{ref}_q = b \cdot \mathit{ref}_g\]

We obtain the rarity factor for $q$ in the same way as for gold, using the distance-weighted availabilities $q_\ell$ and $q_t$. The final price (in cp) of an item at location $\ell$ is then: \[\$_q = \frac{g}{\mathit{ref}_q} \cdot \left(r \cdot \frac{q_t}{q_\ell} + 1\right) \mathit{ref}_g\]

There are some other ways to view this equation. It can simplify again to: \[\$_q = \frac{g}{b} \cdot \left(r \cdot \frac{q_t}{q_\ell} + 1\right)\]
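And the per-unit price, mirroring the simplified form (same illustrative names as above):

def resource_price(g, base_cost, q_local, q_total, n_nodes):
    # Price in cp of one unit of the resource at the current location.
    r = -1.0 / n_nodes
    return (g / base_cost) * (r * q_total / q_local + 1)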

The next step is to determine the availability of labor references. I haven't quite decided how to assign these so I'll save that for a future post. We obtain the available references by once again iterating on the network: \[L_\ell = \sum_i \frac{L_i}{\textrm{dist}\left(\ell, i\right) + 1}\]

The cost of a material $m$ at a given stage (eg, hematite $\to$ iron ore) is then the cost of the raw materials (the unit cost $\$_m$ times the number of units $n_m$) plus the labor cost, which is raw material cost divided by the labor references: \[\$_m = \$_q \cdot n_m + \frac{\$_q \cdot n_m}{\mathit{ref}_L}\]

This step is repeated for each stage of the process, which can be quite complex. I developed a JSON schema to represent each manufactured material. To raise an auroch from calf to weaned, you require the following:

{
  "item": "auroch (weaned)",
  "unit": "hd",
  "stage": 1,
  "tech": 7,
  "weight": 200,
  "recipe": {
    "materials": {
      "auroch (calf)": 1,
      "min": [
        {
          "maize": 483,
          "oats": 483,
          "barley": 483,
          "cassava": 483,
          "rice": 483,
          "wheat": 483
        }
      ]
    },
    "labor": "herdsman"
  }
}

Our raw materials are 1 auroch calf, plus 483 lbs of whichever listed feed grain is cheapest locally, plus the labor of a herdsman. As long as every manufactured item "downstream" of this one exists, it will be available for purchase.
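Costing a recipe is then a small walk over this schema. A sketch, where price() and labor_refs() stand in for the network lookups described above, and the file name is just for illustration:

import json

def stage_cost(item, price, labor_refs):
    # price() and labor_refs() are stand-ins for the network calculations.
    materials = item["recipe"]["materials"]
    cost = 0.0
    for name, qty in materials.items():
        if name == "min":
            # "min" holds interchangeable inputs; take the cheapest option.
            for options in qty:
                cost += min(price(n) * amount for n, amount in options.items())
        else:
            cost += price(name) * qty
    # Labor: raw material cost divided by the available labor references.
    return cost + cost / labor_refs(item["recipe"]["labor"])

# e.g., stage_cost(json.load(open("auroch_weaned.json")), price, labor_refs)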

April 8, 2022

Resources XXIII: Crops and Climates

I've recently completed a little side project that will help resource placement significantly.

I cross-referenced the crop yields from EarthStat with Köppen climate data to get the prevalence for 175 different categories. The data isn't perfect - it doesn't take politics or demographics into account, for example - but I think it will be useful either in automation (as I plan to use it) or in beginning to think about where crops should be placed on a world map.

More details and the data itself can be found here in its own git repo.

March 24, 2022

Detail XIV: Benefits

I have put the world project on hold for a bit to get refreshed with other work.

Over at the Tao, Alexis has been reworking his hammers/coins/food system. It's a brilliant system with endless opportunity.

My own use of programmatic aids lets me break a 20-mile hex down into not just 6 6-mile hexes, but 400 1-mile hexes. This is a lot of detail; not something to be done by hand. And I do think the 20$\to$6$\to$2 system generates very beautiful results. I want to consider working on this for a while to see where it leads. By saving the random seeds that the model is built from, I can replicate a nice result without storing too much data.

This was the result from last year:

Last year's detailed hex

My latest version has some upgrades with color, but also additions of hex-type and benefits (hammers/coins/food). One major change is that the scale will be much different from the Higher Path's: with many more discrete points, the number of benefits will be correspondingly higher.

An example (not of the same hex as above; I couldn't find the old version of this one) of a 20hex with an infrastructure of only 20. Yet, due to its presence in a forest, with some substantial clustering of hexes (producing a large number of high hex types), it generates a total of 134 hammers, 168 food, and 78 coins. These are calculated for each 1hex according to a binary system such that $B = 2 ^ {b - 1} + 1$, then added up for the whole 20hex. This doesn't yet consider any benefits that would be added for the presence of a trade reference.

Updated hex
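As a quick sketch of how those totals roll up, assuming each 1hex stores a small integer $b$ per benefit (the data layout here is illustrative):

def benefit(b):
    # Binary system: B = 2^(b - 1) + 1, so b = 1, 2, 3, 4 -> 2, 3, 5, 9.
    return 2 ** (b - 1) + 1

def hex20_totals(one_hexes):
    # Sum the per-1hex benefits over all 1hexes in the 20hex.
    totals = {"hammers": 0, "food": 0, "coins": 0}
    for h in one_hexes:  # each h maps benefit name -> b value
        for name, b in h.items():
            totals[name] += benefit(b)
    return totals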

There are two paths forward. The first is to adjust the benefits conferred by each hex type such that the numbers become more reasonable. The 20hex total may not be as important as the individual 1hex number: eg, the settlement hex generates {'hammers': 5, 'food': 5, 'coins': 4} in the binary system, or {'hammers': 15, 'food': 5, 'coins': 4} in the decimal system. Without context, it is impossible to determine what constitutes a "reasonable" number. The second is to adjust the results of those benefits.

If a system is to work properly, it has to generate desirable and consistent results. The advantage of doing this programmatically is that I can implement totally different systems at the drop of a hat. The design loop is much tighter than hand calculation (although I am trying to avoid something that is so opaque it can't be replicated by hand).

March 4, 2022

Ex Nihilo VIII: Pressure

Pressure is easily determined from latitude and the presence of landforms. The rules are based on posts from here.

July
January

These parts aren't very exciting. But they're helpful as a public log of my progress.

February 18, 2022

Ex Nihilo VII: Currents

Once the coastline is clearly defined, currents can be determined. I've detailed that process elsewhere, and these are the results for the map we're working with. The blue-white scale indicates the angle of the current in each cell, from 0° to 360°. There's probably a better way I can show this, but for now this'll do.
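Since the value is an angle, 0° and 360° ought to look identical; a cyclic colormap handles that automatically. A quick matplotlib sketch (the angle array is stand-in data):

import matplotlib.pyplot as plt
import numpy as np

angles = np.random.uniform(0, 360, (100, 100))  # stand-in for the current map
plt.imshow(angles, cmap="twilight", vmin=0, vmax=360)  # "twilight" is cyclic
plt.colorbar(label="current direction (degrees)")
plt.show()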

This algorithm takes a really long time to run, so I think I'll accept these results as is, unless some hidden issue rears its head down the line.

February 15, 2022

Ex Nihilo VI: Tectonics III

Continuing on with the work of generating terrain from the tectonics. My first fresh crack uses both tectonic uplift and droplet erosion. These processes are cycled over and over until some condition is met. In this case, I start from a flat plane and stop when at least one cell is at its maximum height (25599 ft). The sea level is determined such that 29% of the total surface is land. I also apply a hypsometric curve to the land area so that higher elevations appear in roughly the same proportions that they do on our earth.
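The sea-level step reduces to a quantile over the heightmap. A minimal numpy version, with the uplift/erosion cycle shown schematically (uplift_step and erosion_step are stand-ins for the real processes):

import numpy as np

MAX_HEIGHT = 25599  # ft

def sea_level(heights, land_fraction=0.29):
    # The elevation below which (1 - land_fraction) of all cells sit.
    return np.quantile(heights, 1.0 - land_fraction)

# The cycle itself, schematically:
# while heights.max() < MAX_HEIGHT:
#     heights += uplift_step(heights)   # tectonic uplift
#     heights -= erosion_step(heights)  # droplet erosion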

However, this algorithm generates a pretty boring topography. The mountains slope up uniformly from the coast, and the unique tectonic features are not preserved. Usable, but disappointing.

The problem is that the erosion cycle is too powerful as the terrain grows from zero, and only the center of the continents (that is, the areas with the least erosion) have any chance of growing at all. To fix this, I began with a terrain generated directly from the relative uplift scaled to max height. The warping effects are clearly visible here but these will be smoothed out by the erosion algorithm.

Next, I again cycle through the erosion, but this time I rescale the height at the end of each cycle and apply the same hypsometric distribution. This yields a much more interesting topography.
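The rescale-and-redistribute step can be done as a rank remap: keep each cell's rank, and replace its value with whatever the target hypsometric curve says that quantile should be. A sketch, where curve is an assumed callable from quantile to elevation (e.g. built with np.interp from Earth's hypsometric data):

import numpy as np

def apply_hypsometry(heights, curve):
    # Remap heights onto the target distribution, preserving rank order.
    # curve is an assumed quantile -> elevation callable.
    flat = heights.ravel()
    ranks = flat.argsort().argsort()       # 0 .. n-1, lowest to highest
    quantiles = ranks / (flat.size - 1)
    return curve(quantiles).reshape(heights.shape)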

The grayscale map is a bit hard to parse, so I threw the map into GIMP and applied a simple colormap, where 10,000 ft begins to shade into gray/white.

There are still some issues I can see, or improvements that can be made. Coastal areas are pretty uniformly low for many hexes inland (no Chilean Andes). Most areas do not have significant mountain ranges, although there is one range similar in size to the Tibetan plateau.

But this is the process. Design, test, repeat. This will be good for now and I'll move on to some other elements of the climate system.

February 4, 2022

Ex Nihilo V: Tectonics II

In the previous iteration, the tectonic plates were generated from a Voronoi algorithm, and thus had very straight edges. Here, I've made those more jagged, which should result in a more interesting coastal topography.

And it does! There are still quite a few far-too-straight lines, but we can live with some of that. The ripple algorithm, which I'll tackle next, should help.

A bit of Gaussian blur:

Next, to closer approximate real-world distributions, I'll apply my hypsometric scaling.

Lastly, to get a feel for how this will shake out, I use the uplift values to generate a quick and dirty initial altitude map. The coastlines are a bit blobby, but I'm happy with this part of the process.

Getting the ripple algorithm to work will be my next order of business.

January 18, 2022

Ex Nihilo IV: Tectonics I

My previous work involved manually drawing out the tectonic boundaries. With the advantage of distance, I'm no longer married to that concept. Instead, we can generate them from scratch.

To approximate an irregular but blue-noised grid, I'll grab a Poisson disk sample from all points on the map. This is a pretty useful algorithm to know, so it's a good chance to rewrite it to be a bit more efficient. After generating the points, tectonic plates are generated as a Voronoi map with the Poisson disks as the centers (the shapes it generates are too uniform, so I'll revisit that later). A Poisson radius of 100 hexes yields 36 centers and the following map. Oceanic plates (approximately 71% of the total area, close to Earth) are shown in a lighter shade.
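The shape of the procedure, for anyone following along (I rewrote my own sampler, so scipy's PoissonDisk is only a stand-in here; the map dimensions are invented, and the radius scaling is approximate for a non-square map):

import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import qmc

W, H, RADIUS = 1200, 800, 100  # invented map size and Poisson radius, in hexes

engine = qmc.PoissonDisk(d=2, radius=RADIUS / max(W, H), seed=1)  # stand-in sampler
centers = engine.fill_space() * (W, H)  # plate centers, scaled to the map

# Voronoi assignment: every hex belongs to its nearest plate center.
yy, xx = np.mgrid[0:H, 0:W]
cells = np.column_stack([xx.ravel(), yy.ravel()])
plate_id = cKDTree(centers).query(cells)[1].reshape(H, W)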

Each plate is assigned a random Euler pole and angular rate of rotation. I can use this to find the strength of collision at each boundary. I'm not super satisfied with the equations I'm using but they can always be modified.
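Concretely: with the Euler pole as a unit vector $e$ and angular rate $\omega$, a point $x$ on the unit sphere moves at $v = \omega \left(e \times x\right)$, and the collision strength at a boundary is how fast the two plates close along the boundary normal. One reasonable formulation (my actual equations differ in the details):

import numpy as np

def surface_velocity(pole, rate, point):
    # v = w * (e x x): velocity of a surface point on a rotating plate.
    return rate * np.cross(pole, point)

def convergence(pole_a, rate_a, pole_b, rate_b, point, normal):
    # Closing speed of plate A against plate B at a boundary point.
    # Positive = collision (uplift); negative = rift.
    v_rel = surface_velocity(pole_a, rate_a, point) - surface_velocity(pole_b, rate_b, point)
    return float(np.dot(v_rel, normal))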

Next, the rate of tectonic uplift for a given hex is determined by each fault's effect on that hex, with a bonus for continental plates (which are lighter and tend to "float"). High uplift values will generate mountains and island chains.

In the past I've played around with various combinations of Perlin noise, domain warping, and other forms of distortion applied to the uplift map to get interesting topographies. The final topography is heavily dependent on the uplift value: the droplet model algorithm gives interesting local topography but is ultimately overpowered by the underlying uplift. So it's important to get it mostly right to begin with. That being said, I find it important not to get too caught up in fine details when there is underlying code to fix. And there is a lot of code to fix and refactor.

To get a rough idea of the kinds of terrain this will generate, we can mock up the elevation values from uplift (essentially scaling from 0 to our max height 25599) and apply a sea height of whatever makes the land percentage 29%. I changed the continents slightly from the plate image above.

Note how straight the lines are, something that will need to be fixed.

The elevation generator is quite slow (particularly given the size of the map), and so it may be a while before I have this round of kinks worked through. My basic algorithm is a droplet model which erodes land based on the tectonic uplift, water erosion, sediment deposition, and coastal erosion. In the past I have used the Wei-Zhou-Dong algorithm for depression filling (which makes all rivers flow to the sea), but I am not terribly pleased with it this time around. We'll see.
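For the depression-filling step, the classic priority-flood algorithm (Barnes et al.) is the simpler cousin of Wei-Zhou-Dong and easy to sketch; both raise every interior pit to its spill elevation so that every cell can drain to the map edge:

import heapq
import numpy as np

def priority_flood(dem):
    # Fill depressions: process cells from the border inward, always
    # expanding from the lowest frontier cell, raising pits to spill level.
    h, w = dem.shape
    filled = dem.copy()
    seen = np.zeros((h, w), dtype=bool)
    pq = []
    for r in range(h):
        for c in range(w):
            if r in (0, h - 1) or c in (0, w - 1):  # seed with the border
                heapq.heappush(pq, (filled[r, c], r, c))
                seen[r, c] = True
    while pq:
        z, r, c = heapq.heappop(pq)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and not seen[nr, nc]:
                seen[nr, nc] = True
                filled[nr, nc] = max(filled[nr, nc], z)
                heapq.heappush(pq, (filled[nr, nc], nr, nc))
    return filled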