How the binary nature of computers affects our data types

This article is part of the sequence The Basics You Won't Learn in the Basics aimed at eager people striving to gain a deeper understanding of programming and computer science.

In the past few weeks, we have discussed the different ways computers deal with binary numbers in order to represent the numbers we are used to seeing - positive, negative and real. This time, we will take a step back from diving into the details of how the hardware deals with such issues and focus on how the design decisions taken by computer architects affect the way we represent data in our code. In particular, we shall explore the different "features" hidden inside the data types we use in our code.
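As a small taste of what those hidden "features" look like in practice, here is a minimal C sketch (not taken from the article itself) showing two of them: unsigned integer wraparound and the fact that 0.1 has no exact binary representation.

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* Unsigned overflow is well-defined in C: the value wraps
       around modulo 2^32 on a typical 32-bit unsigned int. */
    unsigned int u = UINT_MAX;
    printf("UINT_MAX     = %u\n", u);
    printf("UINT_MAX + 1 = %u\n", u + 1u);   /* wraps around to 0 */

    /* 0.1 has no finite binary representation, so the stored
       double is only an approximation and repeated addition drifts. */
    double sum = 0.0;
    for (int i = 0; i < 10; i++) {
        sum += 0.1;
    }
    printf("0.1 added ten times = %.17f\n", sum);   /* not exactly 1.0 */

    return 0;
}
```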
Continue Reading

Introduction to binary numbers

This article is part of the sequence The Basics You Won't Learn in the Basics aimed at eager people striving to gain a deeper understanding of programming and computer science.

Last time, we covered how a processor works. We mentioned that it uses instructions, which are encoded as numbers, and that these numbers are stored in the computer as binary digits. Today, I begin a series of posts on how binary numbers work.
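As a quick illustration (not part of the original post), the C sketch below prints the binary digits of a couple of small integers - the same kind of representation in which a processor's instructions and data are ultimately stored. The print_bits helper is purely illustrative.

```c
#include <stdio.h>

/* Print the 8 bits of a byte, most significant bit first. */
static void print_bits(unsigned char value) {
    for (int bit = 7; bit >= 0; bit--) {
        putchar(((value >> bit) & 1) ? '1' : '0');
    }
    putchar('\n');
}

int main(void) {
    print_bits(5);    /* prints 00000101 */
    print_bits(42);   /* prints 00101010 */
    return 0;
}
```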
Continue Reading
