*Octas*' most notable feature is its basis in (zero-homed) 8-sided dice, working in a numeric base of 8 (octal), rather than the base-10 (decimal) that most contemporary humans are more familiar with.

As mentioned way back in the introduction, the ancient Romans were big fans of the number 10. While early Rome tried *ever-so-hard* to make its military units conform to groups of 10x10 soldiers, it eventually ended up back at *centuries* of around 60 soldiers as the largest practically-manageable unit size. .... I guess they should have gone with base-8 for 8x8 = 64-soldier centuries from the start after all!

Even Roman numerals would be simpler in octal (with V re-valued to four and X to eight), as you would be able to skip all the IV IX XC subtractive-prefix combinations, instead counting I II III V VI VII VIII X XI XII XIII XV XVI.... The reason for using the subtractive prefix is presumably that going to 4 consecutive glyphs such as IIII or MMMMDCCCCLXXXXVIIII is just a bit too long-winded for writing (chiselling!) out efficiently, and also pushes the normal limits of human subitizing ability (at-a-glance sight-counting, which usually hovers between 3 and 4 items depending on circumstances).
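The octal-Roman scheme above can be sketched in code. The glyph values here (I=1, V=4, X=8, L=32, C=64, D=256, M=512) are my own extrapolation from the counting sequence in the text - keeping each glyph's "half of the next power" role by analogy with the decimal originals - not anything historical:

```python
# A sketch of "octal Roman numerals". The glyph values are assumed by
# analogy with decimal Roman numerals (V = half of X, L = half of C, ...):
# I=1, V=4, X=8, L=32, C=64, D=256, M=512.
OCTAL_GLYPHS = [(512, "M"), (256, "D"), (64, "C"), (32, "L"),
                (8, "X"), (4, "V"), (1, "I")]

def octal_roman(n: int) -> str:
    """Greedy conversion; no subtractive prefixes are needed, because no
    glyph below M ever repeats more than three times in this scheme."""
    out = []
    for value, glyph in OCTAL_GLYPHS:
        count, n = divmod(n, value)
        out.append(glyph * count)
    return "".join(out)

# Counting 1..13 reproduces the sequence from the text:
print(" ".join(octal_roman(i) for i in range(1, 14)))
# I II III V VI VII VIII X XI XII XIII XV XVI
```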

Not that the Ancient Romans really *chose* decimal - they inherited it via their Ancient-Greek roots, who in turn likely got it from somewhere earlier still.... If only those long-forgotten ancient peoples had considered their thumbs not to be fingers, for counting purposes.... Again, not a realistic proposition: long before digit-place counting was a thing, you could only (easily) count as high as you had appendages to count on, so the idea of ignoring two perfectly good digits for the sake of mathematical concepts that no-one was even aware of at the time is just wishful silliness.

There is (sketchy) evidence that non-Mediterranean Europe was actually using base-8 at around the dawn of modern human language, but a decimal culture obviously prevailed, probably thanks to Ancient Rome's dominance of the region around the time. Also, some Native American cultures count the gaps *between* the digits of both hands (four gaps per hand), so naturally settled on base-8.

Of course, counting on your fingers (and thumbs), while probably the most obvious, isn't the only way we could have gone, as other numeric cultures from around the world show:

Counting on the knuckles (of one hand, fingers only - three segments on each of four fingers - using your thumb to mark your place) takes you straight to 12, a much better numeric base than 10 due to it being divisible by 2, 3 and 4, which are divisions we tend to do a lot (though *thirding* is not as common as halving or doubling, and is actually quite hard to do physically *by-eye* with any accuracy).

Explicitly starting at 0 with a clenched fist and counting to 5 on the fingers and thumb of one hand gives you an entry point for (zero-homed) base-6 (also divisible by both 2 and 3). Then, since you have an explicit zero, you can easily add in a second hand for a second digit (this actually makes the leap to digit-place counting a little more natural), giving you base-6² to easily count from 0 to 35 on the fingers+thumbs of both hands. This is possibly the easiest development path from single-digit finger-counting to a digit-placement counting system.

There are also some esoteric counting systems that go from fingers, to thumb, to hand, to wrist, to various arm joints, and manage to get as high as base-27 (sort of), though I feel their complexity outweighs any advantage other than not having to invent multi-digit numerics for a while longer.

At the other extreme you have societies that just never needed to develop anything more complex than *none-one-two-many*, but counting higher than 2 is pretty much essential once a civilisation leaves the hunter-gatherer stage and moves to agriculture, so such systems can't really be considered in a mathematical context (though they do often still serve very useful linguistic functions).

Base-10 is (in my *very* non-expert opinion) the second-worst *even-number* base below 20 we could have ended up with! (Base-14 beats it out mainly because we don't have that many fingers+thumbs; otherwise the two are fairly comparable in issues, mathematically.) Odd-number bases are problematic enough that they are barely worth considering at all - division and multiplication by 2 are very, very fundamental to how things get done in the real universe!

It is also worth noting that the conversion between binary and base-10 that computers do for our benefit is highly error-prone in some types of operation: specifically, any fractional part of a non-integer number requires special care, and for precision-critical uses such as physics and finance, special-purpose program libraries need to do a *lot* of extra work to avoid creeping loss of precision across lengthy or iterative calculations. The same is true for any numeric base that isn't a direct power of 2. Base-8, being 2³, converts to and from binary directly - simple, exact, and more than an order of magnitude more efficient in both time and energy. In a pre-technology society, I would probably default to base-6 or base-12. However, in a society tightly integrated with binary computers, base-8 has so many advantages, even without the easy division by 3 that base-6 and base-12 provide.
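Both halves of that claim are easy to demonstrate - the fractional-decimal surprise is standard floating-point behaviour, and the binary-to-octal converter below is a minimal sketch of my own showing that the conversion is pure bit-regrouping with no arithmetic on the value at all:

```python
# Most decimal fractions have no exact binary representation:
print(0.1 + 0.2 == 0.3)   # False - the classic floating-point surprise

# Octal <-> binary, by contrast, needs no arithmetic on the value:
# each octal digit is exactly three bits, so conversion is just
# regrouping the bit string.
def binary_to_octal(bits: str) -> str:
    bits = bits.zfill(-(-len(bits) // 3) * 3)  # left-pad to a multiple of 3
    return "".join(str(int(bits[i:i + 3], 2)) for i in range(0, len(bits), 3))

print(binary_to_octal("110101"))  # "65": 110 -> 6, 101 -> 5
```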

Even base-8's no-integer-division-by-3 issue isn't as bad as it could be, as 8 being directly adjacent to 9, which is 3², filters back through the multiplication tables to neaten things up and generate easy-to-work-with patterns rather more than one might expect!
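One concrete example of that neatening: since 9 (= 3²) is written 11 in octal, its multiples form the same tidy repeated-digit pattern that multiples of 11 do in decimal:

```python
# 9 is "11" in octal, so multiples of 3x3 line up as repdigits,
# exactly like the 11-times table does in decimal.
for n in range(1, 9):
    print(f"{n} x 9 = {oct(9 * n)[2:]} (octal)")
# prints 11, 22, 33, 44, 55, 66, 77, 110
```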

Base-10 does get a similar advantage via the same 9 value - probably its only mathematical advantage over base-14, and why one third at least works out to the relatively memorable 0.33333....!

Octal is a quite nice numeric base, is what I'm saying.

...

I have also played around with base-4, which has the same 2ⁿ advantages as base-8 and a few extra pros and cons of its own, but I ultimately found it unwieldy in the too-small direction, as common-range numbers quickly blew out to excessively long strings of digits. Bases in the 6-12 range seem to be the human brain's sweet-spot for working with digits.
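The blow-out is easy to quantify with a digit-counting helper (my own, using repeated division to avoid floating-point log edge cases): a common-range number like one million takes 10 digits in base-4 versus 7 in base-8 or base-10.

```python
def digit_count(n: int, base: int) -> int:
    """Number of digits needed to write n in the given base."""
    digits = 0
    while n:
        n //= base
        digits += 1
    return max(digits, 1)  # zero still needs one digit

for base in (4, 8, 10, 16):
    print(f"base {base:2}: one million takes {digit_count(1_000_000, base)} digits")
```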

Don't mention hexadecimal (base-16)! It is very useful for making strings of binary numbers human-readable, as it packs perfectly into two digits per 8-bit byte, which octal doesn't. But base-16 is far too big and unwieldy for human-mathematical use - even people who claim to be able to do base-16 maths in their head are usually converting to base-4 (a trivially easy operation between these two bases), doing the maths in that, then converting back. I may as well claim to be working in big-brain base-64 but doing the actual maths in simple octal! A certain 18th-century Swedish King would at least be impressed!
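Just how trivial that base-16 to base-4 hop is can be seen in a small sketch (function name mine): since 16 = 4², every hex digit splits into exactly two base-4 digits with no arithmetic on the overall value.

```python
# Each hex digit is exactly two base-4 digits (16 = 4^2), so the
# "conversion" is pure digit-splitting.
def hex_to_base4(hex_str: str) -> str:
    out = []
    for ch in hex_str:
        d = int(ch, 16)
        out.append(f"{d >> 2}{d & 3}")  # high and low base-4 digit
    return "".join(out).lstrip("0") or "0"

print(hex_to_base4("2F"))  # "233": 2 -> 02, F -> 33
```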