Path: news.nzbot.com!not-for-mail
From: Bradley Bungmunch <bradleybungmunch@yahoo.com>
Newsgroups: alt.fan.karl-malden.nose
Subject: Re: Soja's History Challenge - My Answer
Date: Fri, 01 Aug 2003 19:08:07 +0100
Organization: H0t w3t 69s!!!1!!
Lines: 105
Message-ID: <fralivkool54aql2rpph9bllevkuh8cum2@4ax.com>
References: <eeca89762d2dde854e135f38de891a20@rebleep> <MPG.198abfdc1c99450c98a2e2@news.alt.net>
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
X-Newsreader: Forte Agent 1.92/32.572
Xref: news.nzbot.com alt.fan.karl-malden.nose:200
On Fri, 25 Jul 2003 05:18:07 -0400, Vtrous <byte@fiber.cable> had a
scratch and wrote:
>In article <eeca89762d2dde854e135f38de891a20@rebleep>, soja@oblivion.net
>says...
>> Well, since it looks as if the dust has settled on the challenge,
>> I'll provide my answer before the original post expires on your
>> NNTP servers... The essential question regards the choice of byte
>> size: why was it (in practical terms) 12 bits early on and why did
>> it switch to 8 bits later?
>>
>> The first question is easier. The original size of the byte was
>> dictated by the permanent storage medium of the day. The use of 12
>> bit data fragments came from the IBM punch card, which was laid out
>> in 80 columns of 12 rows. Why that size? It was the result of the
>> Hollerith code, devised by Herman Hollerith (who also founded the
>> company that became IBM) for the representation of alphanumeric data
>> on punch cards, in an era when "computer" referred to people, not
>> machines. In the Hollerith code, 12 positions were needed to
>> represent the characters found on a typewriter keyboard. And, no,
>> that wasn't the only choice possible
>> (see http://www.fourmilab.ch/documents/univac/cards.html for an
>> alternative) but it came to dominate the market in a scenario eerily
>> similar to M$'s dominance of the software biz today. For those who
>> might ask why punch cards were used at all, the answer lies in part
>> in the 19th Century introduction of the Jacquard loom, which used
>> punched cards to automate the movement of the loom [see "The
>> Difference Engine" by Sterling and Gibson for a fictionalized description].
>>
>> So, in the 1960s we see a change in the popular size of the byte from
>> 12 bits to 8 bits. Why? The answer is two-pronged (at the very
>> least), resulting from the (gradual) transition from punch cards to
>> magnetic and paper tapes as the dominant permanent storage media of
>> the day and the adoption of the ASCII code. Obviously, there's
>> nothing intrinsic to these other media that required the abandonment
>> of 12-bit bytes, but the transition stimulated a rethinking of the appropriate
>> size of a "byte" (though that term wasn't coined until later). So,
>> why 8 bits? Well, the folks developing the ASCII code found that
>> they needed to represent a minimum of 75 characters (upper/lower case,
>> numbers and the punctuation found on a typewriter keyboard), so that
>> meant 7 bits. So, why not a 7 bit byte? The answer to that is in part
>> again IBM. Just like M$ today, they were always trying to develop
>> their own de facto standard, so they came up with an alternate code,
>> EBCDIC, for the binary encoding of alphanumeric data. EBCDIC, in
>> contrast to ASCII, was an 8-bit code. So, when IBM introduced its new
>> 360 series computer, it used 8-bit bytes. As one of the dominant
>> computer manufacturers of its day, IBM was able to command a lot of
>> attention from the people making peripherals like tape drives,
>> so magtape was devised to be 9-track: eight data tracks plus a ninth
>> for parity (there were also 7-track drives made, but like Beta VCRs
>> they are now dust). The eighth bit beyond ASCII's seven also found a
>> use as a parity bit for error checking, since magnetic and paper
>> media were subject to transcription errors.
>> That was the final straw, meaning that the de facto standard for
>> magtape and paper tape was 8-bit, leading to the universal adoption of
>> 8-bit bytes.
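
The arithmetic in that last paragraph is easy enough to check: 2^6 = 64
code points can't cover ~75 characters, so you need 7 bits, and a parity
bit on top makes 8. A quick illustrative C sketch of my own (bits_needed
and add_parity are just made-up helper names, nothing out of any
standard):

    #include <stdio.h>

    /* Smallest number of bits that gives each of n characters a
       distinct code. */
    static int bits_needed(unsigned int n)
    {
        int bits = 0;
        unsigned int slots = 1;
        while (slots < n) {
            slots *= 2;
            bits++;
        }
        return bits;
    }

    /* Even parity over the low 7 bits, stored in bit 7 - the kind of
       job the "spare" eighth bit got on paper tape and serial links. */
    static unsigned char add_parity(unsigned char c)
    {
        unsigned char ones = 0;
        int i;
        for (i = 0; i < 7; i++)
            ones += (unsigned char)((c >> i) & 1);
        return (unsigned char)(c | ((ones & 1u) << 7));
    }

    int main(void)
    {
        printf("75 characters need %d bits\n", bits_needed(75));
        printf("'A' = 0x%02X, with parity bit: 0x%02X\n",
               (unsigned int)'A', (unsigned int)add_parity('A'));
        return 0;
    }

Any C compiler will do; it prints 7 for the 75-character case, and since
'A' already has an even number of 1 bits its parity bit stays clear.
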
>
>I have a question: Who gives a shit?
>
More to the point - the author is too fucking stupid to actually know
that a byte is not defined as 8 bits. They've mistaken an
architectural convention for some sort of defined standard.
I challenge them to quote the relevant standard (it does exist, btw)
that defines byte size.
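
For anyone who'd rather check than take that on faith, the C standard is
one candidate: there a byte is whatever CHAR_BIT says, required to be at
least 8 bits but not pinned to exactly 8. A minimal probe (my own
sketch, and only one possible reading of "the relevant standard"):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* In ISO C a "byte" is the storage taken by one char:
           CHAR_BIT bits, guaranteed to be at least 8 but not
           guaranteed to be exactly 8. */
        printf("CHAR_BIT here: %d\n", CHAR_BIT);
        printf("sizeof(char) is always %d byte\n", (int)sizeof(char));
        return 0;
    }

On a typical desktop this prints 8, but the standard only promises
"at least 8"; sizeof(char) is 1 by definition either way.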