According to Wikipedia:

The first modern, electronic ternary computer, Setun, was built in 1958
in the Soviet Union at the Moscow State University by Nikolay
Brusentsov,[4][5] and it had notable advantages over the binary
computers that eventually replaced it, such as lower electricity
consumption and lower production cost.[4] In 1970 Brusentsov built an
enhanced version of the computer, which he called Setun-70.[4] In the
United States, the ternary computing emulator Ternac working on a
binary machine was developed in 1973.[6]:22 The ternary computer QTC-1
was developed in Canada.[7]

Doesn't seem like they caught on otherwise, though.

Niklas

On Wed, 3 Feb 2021 at 21:10, Dave Horsfall wrote:

> On Wed, 3 Feb 2021, Peter Jeremy wrote:
>
> > I'm not sure that 16 (or any other 2^n) bits is that obvious up front.
> > Does anyone know why the computer industry wound up standardising on
> > 8-bit bytes?
>
> Best reason I can think of is System/360 with 8-bit EBCDIC (Ugh! Who
> said that "J" should follow "I"?). I'm told that you could coerce it
> into using ASCII, although I've never seen it.
>
> > Scientific computers were word-based and the number of bits in a word
> > is more driven by the desired float range/precision. Commercial
> > computers needed to support BCD numbers and typically 6-bit
> > characters. ASCII (when it turned up) was 7 bits and so 8-bit
> > characters wasted ⅛ of the storage. Minis tended to have shorter
> > word sizes to minimise the amount of hardware.
>
> Why would you want to have a 7-bit symbol? Powers of two seem to be
> natural on a binary machine (although there is a running joke that CDC
> boxes had 7-1/2 bit bytes...)
>
> I guess the real question is why did we move to binary machines at all;
> were there ever any ternary machines?
>
> -- Dave
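
P.S. Setun is usually described as using balanced ternary (each "trit"
is -1, 0 or +1) rather than plain base 3. Purely as an illustration of
the idea, and nothing to do with the real hardware, here is a toy
round-trip in Python:

def to_balanced_ternary(n):
    """Balanced-ternary trits of n, least significant first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3              # 0, 1 or 2
        n //= 3
        if r == 2:             # rewrite digit 2 as -1, carry 1 upward
            r = -1
            n += 1
        trits.append(r)
    return trits

def from_balanced_ternary(trits):
    """Inverse: sum of trit * 3**position."""
    return sum(t * 3**i for i, t in enumerate(trits))

for n in (-5, 0, 8, 42):
    trits = to_balanced_ternary(n)
    assert from_balanced_ternary(trits) == n
    print(n, trits)

Three values per digit works out to about 1.58 bits of information per
trit, which is part of why ternary looked attractive on paper.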