Why did early computer designers eschew integers?



Several early computer designs regarded a 'word' as representing not an integer, with the bits having values 2^0, 2^1, 2^2, ..., but a fixed-point fraction, with the bits having values 2^-1, 2^-2, 2^-3, ...



(For the sake of simplicity in this question I'm ignoring the existence of the sign bit and talking only in terms of positive numbers.)



Some examples of this convention are EDVAC, EDSAC, and the IAS machine.
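
To make the two readings concrete, here is a small Python sketch (purely illustrative; the word length and bit pattern are made up, not taken from any of those machines). The same N-bit word read as a fraction is just its integer reading scaled by 2^-N:

    # Purely illustrative: the same N-bit word read two ways.
    N = 8                        # hypothetical word length
    word = 0b10110100            # some bit pattern held in the word

    as_integer = word            # bits weighted 2^(N-1), ..., 2^1, 2^0
    as_fraction = word / 2**N    # bits weighted 2^-1, 2^-2, ..., 2^-N

    print(as_integer)            # 180
    print(as_fraction)           # 0.703125  (= 180 / 2**8)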



Why was this? To me, having dealt since the 1970s with machines that have "integers" at base, this seems a strange way to look at it.



Does it affect the machine operation in any way? Addition and subtraction are the same regardless of what you think the bits mean, but I suppose that for multiplication of two N-bit words giving an N-bit result, the choice of which N bits of the 2N-bit product to keep depends on your interpretation. (Integer: you want the "right hand" word; fixed-point fraction: you want the "left hand" word.)
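
As a rough sketch of that last point (illustrative only, not modelled on any particular machine): the full product of two N-bit words is 2N bits wide, and the two conventions keep opposite halves of it.

    # Illustrative only: multiply two N-bit words, keep an N-bit result.
    N = 8
    a, b = 0b11000000, 0b01000000     # 192 and 64 as integers; 0.75 and 0.25 as fractions

    full = a * b                      # the exact product needs 2N bits

    low_word = full & ((1 << N) - 1)  # integer convention: keep the "right hand" word
    high_word = full >> N             # fraction convention: keep the "left hand" word

    print(low_word)                   # 0      (192 * 64 overflows 8 bits)
    print(high_word / 2**N)           # 0.1875 (= 0.75 * 0.25)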










asked 1 hour ago by another-dave
  • Very early on, it was likely that computers were not considered to be general-purpose machines. So if the main task for which a computer was designed involved doing calculations with fractional numbers, prioritizing them over integers would make sense. It seems likely that computers designed for business applications would be more tuned to integers, because money (in the USA) can be treated as pennies, and very little would need to be fractional.

    – RichF
    47 mins ago
















1 Answer
I'd think that it was mostly down to the preferences of John von Neumann at the time. He was a strong advocate of fixed-point representations, and early computers were designed with long words to accommodate a large range of numbers that way. You certainly don't need 30-40 bits to cover the most useful integers, but that many were needed if you wanted plenty of digits before and after the decimal point.

By the 1970s, though, the costs of integration were such that much smaller word sizes made sense. Minicomputers were commonly 16-bit architectures, and micros 8-bit or sometimes even 4-bit. At that point you needed all the integer range you could get, and floating point had largely replaced fixed point for when you needed decimals.

Nowadays we'd think nothing of using 64-bit integers, of course, but it's a heck of a lot easier to integrate the number of logic gates required for that than it would have been back when they all had to be made out of fragile and expensive vacuum tubes.






answered 26 mins ago by Matthew Barber (new contributor)