Information about computers
The computer is an electronic device, an "electronic brain", that performs large numbers of arithmetic and logical operations according to a given program. It carries out logical and arithmetic operations on the data it receives from the user, can store the results of those operations, and makes the stored information accessible at any time.
Computer
Computer (Ger. Computer (m); Fr. ordinateur (m); Eng. computer): an electronic device that performs large numbers of arithmetic and logical operations according to a given program. When it is told what is to be done and how the information is to be processed, it accomplishes in a few seconds work that many people could not finish in years. Because it is now built with semiconductor technology, it is also popularly known as the "electronic brain"; historically, however, the first computers were mechanical (for example, Charles Babbage's Difference Engine).
In Turkey the device was at first called "komputer" or "electronic brain". Aydın Köksal, who was working at Hacettepe University in 1969, used the Turkish word "bilgisayar" in a newspaper advertisement placed to rent the computer the university needed.
Computers come in many different forms. The first computers of the mid-20th century were the size of a large room and consumed hundreds of times more power than today's machines. By the beginning of the 21st century, computers could fit into a wristwatch and run on a small battery. Society came to see the personal computer and its portable equivalent, the laptop, as symbols of the information age and identified them with the very idea of the computer; they are in widespread use today.
Their ability to store and run programs makes computers versatile and distinguishes them from calculators. The Church-Turing thesis is the mathematical expression of this versatility: it states that any computer can perform the tasks of any other. Thus, regardless of their complexity, machines from pocket computers to supercomputers can all perform the same tasks, leaving memory and time constraints aside.
The computer should not be thought of as possessing intelligence of its own; on the contrary, the computer is itself a product of human intelligence. It is a tool that depends on people both for its development and for its proper use. Its superiority, so to speak, is that it can repeat the same operation many times over, quickly and precisely, without getting bored or distracted. But the computer can only work correctly if it is told, in clear and unambiguous instructions, exactly what to do and how to do it.
History
The first general-purpose automatic computing device that can be considered the ancestor of modern computers is the Analytical Engine designed by Charles Babbage. This machine could process instructions written on punched cards. The personal computer, the machine most people now picture when they hear the word "computer", was first produced by IBM in 1981 and quickly became an industry standard. IBM PCs used Intel processors and shipped with Microsoft DOS.
Etymology
The Turkish words for "computer", "software", "hardware", "informatics" and "data processing" were introduced into the language by Aydın Köksal. While a spacecraft orbits the Earth, it is not the pilots who control it: every movement of the craft is governed by a computer. Today workplaces prepare workers' wages with computers, and meteorologists make their weather forecasts with them. Newspapers and magazines are prepared on computers, children study with them, and intelligence information is stored on them.
The history of the computer: The first push-button-operated electronic computer in the United States required a vast amount of equipment and centuries of accumulated scientific experience before it became operational at the University of Pennsylvania in 1946. In those days hardly anyone took the machine, known as ENIAC (Electronic Numerical Integrator and Computer), seriously. It could never run for very long at a stretch, because its vacuum tubes kept burning out. Built for half a million dollars, ENIAC was designed to calculate artillery firing tables and could perform 5,000 additions and subtractions per second. Today even an ordinary "home computer" outperforms the mighty ENIAC.
But it was not easy to get to this point. Human beings have played with numbers since the day they first existed. At first small stones were used for the job; the word "calculation" is thought to derive from calculus, the Latin word for pebble. Some 2,500 years ago the Chinese found that they could save time in their bookkeeping with beads that slide easily along cords.
In 1642 the Frenchman Blaise Pascal developed a machine which he thought would help his father, a tax collector. When its small wheels were turned, it could add or subtract automatically. The clerks who earned their living from hours of accounting work, however, saw Pascal's machine as a competitor and did not welcome it at all.
A while later the German mathematician Gottfried Wilhelm Leibniz added to the machine the ability to multiply and divide. According to Leibniz, distinguished people should not, like slaves, lose hours in the labour of calculation.
These machines could be used only for the four operations of arithmetic. The first to conceive of a real computer, a machine that could do far more than calculate, was the 19th-century English mathematician Charles Babbage. Babbage, known also as the father of the first reliable indicator and of the speedometer used on locomotives, set out to build a machine that could automatically compute long functions such as logarithms. First he built his "Difference Engine", a complicated system of wheels and gears. Babbage then wanted to go beyond this simple model and pursued a project he called the "Analytical Engine". The new machine would work in a far more complex way than the old one, yet it contained all the basic features of today's computers. A logical processing centre, the "brain", which Babbage called "the mill", would work on information according to set principles. There would be a "store" for holding information, a unit supervising the instructions given to the machine, and mechanisms for feeding information in and taking it out. Above all, the machine's entire mode of operation could be changed at will: the Analytical Engine could be programmed.
Babbage worked on his machine for almost forty years, but the Analytical Engine was never built. He laboured hard over the device, which would have been the size of a football field and would have needed at least six steam engines to drive it. When the government stopped supporting him he raised money from friends, but in the end he failed. Even so, one of the machine's key principles had attracted attention.
Babbage had thought it would be useful to feed information into his machine on punched cards. Just as the cards used on French looms controlled the separation of colours and patterns, the Analytical Engine could "weave" algebraic patterns. In the United States, a young engineer named Herman Hollerith took up the idea of using punched cards for the 1890 census. Characteristics of each person, such as age, sex, marital status and race, could be marked on the cards as holes according to a coding scheme, and the data on the cards could then be tallied by electrical readers. After the census was counted with surprising ease, the use of punched cards in office machines spread almost overnight. A then little-known American company that bought Hollerith's small firm gave the system pride of place in its machines; that company became the core of today's giant IBM.
The real computer of Babbage's dreams could not be realised until the 1930s. Konrad Zuse, an unknown engineer in Hitler's Germany, managed to build a simple computer in the living room of his father's house, which he used as a workshop. This computer, which could carry out a variety of tasks, was also used for accounting in the German aircraft industry during the Second World War. In 1939 George Stibitz, a mathematician working in the research laboratories of the Bell Telephone Company in the United States, built a similar machine. He even demonstrated how calculations could be carried out over telephone lines, the first time information had been transmitted to a computer at a distance. In England a computer called Colossus, associated with the work of Alan Turing, succeeded in deciphering German military codes during the war.
The American, German and British computers all had one thing in common: they calculated with the binary system, and in this they were the pioneers of the computers used today. Babbage's famous machine, by contrast, counted from zero to nine, in what is called the "decimal" system. The habit of counting with the numbers 0 to 9 is thought to come from mankind's ten fingers.
In computers, operations are carried out by "open-closed" electronic circuits called logic gates. The information given to the machine passes through these gates as electrical currents. Even a small home computer contains thousands of such gates, and they open and close more than a million times a second. This switching takes place in electronic circuitry.
The ancestors of today's push-button calculators could not work anywhere near this fast. They used electromechanical relays, which worked on the same principle as the old Morse telegraph. IBM's first big computer, the Mark I, sounded as it worked like a room full of women knitting wool. The American physicist Jeremy Bernstein recalls: "The machine took about five seconds to do a calculation with two 23-digit numbers. Today pocket calculators give the same result in under a second."
Instead of electromechanical relays, ENIAC used radio tubes to switch its circuits on and off, which greatly increased its calculating speed. But the difficulties were not over. To perform a different task, ENIAC had to be re-plugged by hand from one socket to another, just like an old telephone exchange, and in some cases this reconfiguration took days.
The mathematician John von Neumann found the solution: the working instructions given to the machine could be entered in just the same way as the data, written in binary and stored as a program in advance. A reel of tape or a keyboard was enough for this. It was the Sperry Rand company that launched the first computer with this capability; when UNIVAC I was delivered to the US Census Bureau in 1951, executives at rival companies, who had been slow off the mark, did not know what to do.
In 1947 three researchers working in the Bell Telephone laboratories invented the transistor, a tiny and simple device. The name is a contraction of "transfer resistor", and the invention itself was a sandwich of semiconductor material. Germanium crystals were the main ingredient of the sandwich; silicon would be used in later years. The crystals were arranged so that a very small current on one side of the sandwich could control a much larger current in the adjoining circuit. Switching a circuit on and off, and controlling the rise and fall of the current, became much easier. Transistors were far smaller than vacuum radio tubes, they calculated faster, and they broke down less often. Because they gave off little heat, packing them close together was no problem, and on top of that they were very cheap to make.
Within a few years Bell's experts had managed to build a fully transistorised computer. One of the three architects of the invention, William Shockley, soon returned to his home state of California and started his own business. The region around Palo Alto, now known as Silicon Valley, was to become one of the hearts of the computer industry. In Dallas, a young entrepreneur who made drilling equipment for oil wells wanted to expand his firm, Texas Instruments, and hired Gordon Teal, a former colleague of Shockley's at Bell. The biggest customer for what they produced was the Pentagon: the US Department of Defense judged that transistorised devices offered great computational advantages in guiding missiles.
The first computers built with transistors were still somewhat reminiscent of old radios: every component was soldered in by hand. Electronics manufacturers soon realised that these connections could be "printed" automatically on a board, so there was no need to wire them by hand. At the end of the 1950s, Jack Kilby of Texas Instruments and Robert Noyce arrived at the same idea at almost the same time: on a single piece of silicon, as many transistors as desired, together with the connections between them, could be formed directly as a pattern. Such "integrated circuits" could contain a complete building block of a computer, for example a logic circuit or a memory register.
Hundreds of thousands of transistors could be formed on a single slice of silicon, and more and more circuits were added to these "microchips" every day. Yet not all the difficulties had been overcome. The circuits laid down on the silicon were not flexible; they could only ever do the one job they had been designed for. As the experts put it, they were "hard-wired". In 1971 Intel developed the microprocessor: Ted Hoff's invention succeeded in putting the entire central processing unit onto a single slice of silicon.
Thanks to the microprocessor, a single chip could be programmed to perform whatever tasks were required, from running a watch to running a spacecraft. This invention became the basic principle of the small machines used in offices and known as home computers. New computers were introduced in 1975; some soon disappeared from the market, but there were plenty of new ones to take their place. Imaginative, young and dynamic teams did not stand idle then, and they are not standing idle now: they continue to advance the computer in giant strides.
Working system
Parts of computers: At first glance, a microcomputer is a simple instrument with a typewriter-like keyboard and a television-like screen. Various auxiliary devices can be connected to it: disk memories, printers, graphic plotters and many other units. All of these make up the "hard parts" of the computer, its hardware.
Programs: To bring the computer to life, complete instructions must be given to it. Each set of statements describing how a particular job is to be carried out is called a program. The sum of all these programs makes up the "soft parts" of the computer, its software.
Computer logic: To learn how the computer works, you need to understand how it handles its information. Letters and numbers are used only after they have been expressed as codes. In computers, codes are expressed by the presence or absence of a voltage: if the voltage is present (the lamp is on), the value is 1; if there is no voltage (the lamp is off), the value is 0. This two-state coding is called the "binary system". Any information entered at the keyboard is converted into codes made up of 1s and 0s.
Each 0 or 1 is called a bit, and a group of eight bits is called a byte. The computer performs all its operations in the binary number system; the individual operations are very simple, but they are carried out extremely fast.
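As a small illustration, the sketch below, written in Python purely as an example, shows how each character of a short text corresponds to a numeric code, and how that code can be written out as eight bits, one byte:

    # A minimal sketch showing text expressed as codes of 1s and 0s.
    # The codes follow the standard ASCII character set.
    text = "OK"
    for ch in text:
        code = ord(ch)                 # the numeric code of the character
        bits = format(code, "08b")     # the same code written as eight bits (one byte)
        print(ch, code, bits)
    # Prints:
    # O 79 01001111
    # K 75 01001011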
Logic: Computers can not only calculate with numbers but can also make decisions. These decisions rest on the rules of logic known as Boolean algebra. Depending on the conditions, the computer can combine "yes" and "no" answers with AND, OR and NOT. For example: to move house you need a truck AND a driver. If the truck has to cross a narrow bridge, it will hit the bridge if it is too wide OR too high. If the house to be moved is NOT empty, the move will be delayed. Here AND, OR and NOT decisions are being made.
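The same reasoning can be written directly as a program. The sketch below, again in Python and with invented variable names for this truck-and-bridge example, combines the conditions with AND, OR and NOT:

    # A toy sketch of the moving-house decisions expressed with Boolean operators.
    # All of the variable names and values are invented for illustration.
    have_truck = True
    have_driver = True
    truck_too_wide = False
    truck_too_high = True
    house_empty = False

    can_start_move = have_truck and have_driver          # AND: both are needed
    hits_the_bridge = truck_too_wide or truck_too_high   # OR: either one is enough
    move_is_delayed = not house_empty                     # NOT: delayed unless empty
    print(can_start_move, hits_the_bridge, move_is_delayed)  # True True True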
Operation of computers: Computers consist of four main parts: memory, input, output and the central processing unit (CPU). The CPU carries out operations such as addition, subtraction and comparison, one after another.
Programs such as addition and subtraction are taught to the CPU in advance; it is then enough to give it two numbers through the input unit. The CPU contains sections such as the clock, the program counter, the unit that interprets instructions, and the arithmetic and logic unit.
CPU: The heart of the computer. The commands to be executed are brought to the CPU one by one. Each command is carried out according to its contents and the current state of the computer. All arithmetic and logical operations are performed in the ALU, the arithmetic and logic unit inside the CPU. The result of an operation is held temporarily in a special memory cell called the accumulator and then delivered to its destination. The program counter keeps track of where the next command is in memory. The clock supplies the pulses needed for the electronic operation of the CPU, and the frequency of these pulses is the main factor determining the computer's speed.
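To make this cycle concrete, the toy sketch below steps through fetch and execute in Python, using a program counter and an accumulator. The three-instruction machine it models is invented purely for illustration and does not correspond to any real CPU:

    # A toy fetch-execute loop: the program counter selects the next command,
    # the ALU work is a simple addition, and the accumulator holds the result.
    memory = [
        ("LOAD", 7),     # put 7 into the accumulator
        ("ADD", 5),      # add 5 to the accumulator
        ("STORE", 0),    # deliver the result to data cell 0
        ("HALT", None),  # stop
    ]
    data = [0]           # a single data cell acting as the destination

    accumulator = 0
    program_counter = 0
    while True:
        opcode, operand = memory[program_counter]   # fetch the next command
        program_counter += 1                        # point at the one after it
        if opcode == "LOAD":
            accumulator = operand
        elif opcode == "ADD":
            accumulator = accumulator + operand     # the ALU's job
        elif opcode == "STORE":
            data[operand] = accumulator
        elif opcode == "HALT":
            break
    print(data[0])  # 12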
Memory: All the units in which the computer's information is kept.
Main memory: This consists of a large number of memory cells, each of which has an "address". Information is stored here electrically, and when the power fails the contents of main memory are lost.
Auxiliary memory: The units in which information is kept for longer than in main memory. They usually rely on magnetic recording; they are cheaper than main memory but slower. Floppy and hard disks and magnetic tapes are the most common types of auxiliary memory.
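The difference between the two kinds of memory can be pictured with the short Python sketch below: a list stands in for main memory, with cells reached by address, and a file with a made-up name stands in for auxiliary memory that keeps its contents after the power is gone:

    # Main memory as addressed cells that exist only while the program runs.
    main_memory = [0] * 8          # eight cells with addresses 0..7
    main_memory[3] = 42            # store a value at address 3
    print(main_memory[3])          # read it back: 42

    # Auxiliary memory as a disk file (the file name is invented): slower,
    # but its contents remain when the program is started again.
    with open("auxiliary_memory.txt", "w") as f:
        f.write(str(main_memory[3]))
    with open("auxiliary_memory.txt") as f:
        print(f.read())            # 42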
Input devices: The most common input device is the keyboard. Information to be given to the computer is entered through keys corresponding to letters, numbers and special signs. The input device known as the "mouse", named after its appearance, is also widely used. Other input devices include light pens, tablets and voice decoders.
Output devices: Output devices, like input devices, are directed at people; at output the information is translated back out of the computer's internal code. On microcomputers an ordinary television screen can serve as the output; special screens, called VDUs for short, are made for sharper viewing. Printers are used when the output is to be recorded on paper. The most widely used is the dot-matrix printer, in which a set of thin wires prints letters and numbers on the paper as patterns of inked dots. Electric spark printers burn the dots onto the paper and work very quietly. Daisy-wheel printers are different again: the letters, numbers and punctuation marks sit on a wheel, and when the character to be printed reaches the hammer on the rotating wheel, the hammer strikes the wheel and leaves the mark on the paper. They are slower than the other printers.
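As a rough picture of how a dot-matrix printer builds a character out of dots, the toy Python sketch below prints an invented 5-by-5 dot pattern for the letter H, with an asterisk wherever a wire would strike the paper:

    # Each 1 marks a dot put down by a printing wire; each 0 is left blank.
    letter_h = [
        [1, 0, 0, 0, 1],
        [1, 0, 0, 0, 1],
        [1, 1, 1, 1, 1],
        [1, 0, 0, 0, 1],
        [1, 0, 0, 0, 1],
    ]
    for row in letter_h:
        print("".join("*" if dot else " " for dot in row))
    # Prints:
    # *   *
    # *   *
    # *****
    # *   *
    # *   *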
Computer programming: The computer itself uses only machine language, which consists of 0s and 1s. This language is difficult for human programmers, so special programming languages have been developed, based largely on English: BASIC, Pascal, APL, COBOL and Fortran are examples. As soon as the programmer enters a program written in one of these languages, the computer translates it into machine language by means of a program called a compiler.
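This translation step can be glimpsed with the small sketch below, which uses Python rather than one of the languages named above; its built-in dis module displays the simpler low-level instructions produced from a single high-level statement:

    # Python's dis module shows the internal instructions generated for a
    # function, which here stand in loosely for machine language.
    import dis

    def add(a, b):
        return a + b   # one high-level statement

    dis.dis(add)       # prints low-level instructions such as LOAD_FAST,
                       # a binary-add instruction, and a return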
Areas where the computer is used: Computers are used in five main areas: in information processing, to evaluate collected data and help reach decisions; as calculators, to solve problems; as memories, to store information; in printing, especially in work such as newspapers and magazines that must be set quickly; and as controllers, to start and stop other devices.
Nowadays many English computer terms are used without Turkish equivalents. The most common of them are listed below.
Access: Reaching stored information, or making use of a computer or program.
Address: Description of the location of the piece of information in the computer's memory.
Assembly Language: A low-level programming language close to the machine's own instructions. Each type of CPU has its own machine language.
BASIC (Beginner's All-purpose Symbolic Instruction Code): A programming language commonly used on small and personal computers.
Binary: The number system that uses only the digits 0 and 1.
Buffer: The location where the information is temporarily placed.
Bug: An error in a program or in the electrical systems of the units; debugging is the process of removing such errors.
Byte: A group of eight binary digits. Each byte corresponds to a letter, number or symbol, and the capacities of computers are usually measured in bytes.
Compiler: The program that translates programs written in a programming language into machine language.
CPU (Central Processing Unit): The unit that processes the commands given to the system by the user.
Chip: The common name for an integrated circuit formed on a small piece of silicon.
Cursor: A marker on the screen indicating the current working position.
Database: A large and organized collection of information.
Density: The amount of information that can be stored on a given area of a disk surface.
Disk: The rotating plate on which information is recorded.
Disk Drive: The device that records information onto a disk and reads it back.
DOS (Disk Operating System): A collection of programs used to perform disk and related operations.
Error message: A message informing the user that something has been done wrong.
File: A collection of information with a specific name.
Floppy disk: Cheap small disk used to store information.
Format: The order in which information is stored.
Graphics: Pictures and shapes in computer programs.
Hardware: The physical part of the computer.
Hexadecimal: The base-16 number system. Machine-level listings are often written in this notation.
Interface: The connection that links one system to another.
Menu: A list of ready-made options presented to the user on the screen.
Memory: The part of the computer in which information is kept.
Modem: A unit that allows the transfer of information from one system to another using a telephone or a direct line.
Monitor: The screen on which information from the computer is displayed.
Printer: The unit that prints the results on paper.
Program: Computer-coded orders to perform a specific operation.
RAM (Random Access Memory): The section of memory whose contents can be changed by the user.
ROM (Read Only Memory): The section of memory whose contents cannot be changed by the user.
Software: The programs, instructions and procedures of a computer; everything apart from its physical parts.
Terminal: A working unit that is separate from the host computer but accessible to it.
User Friendliness: The quality of being easy for people to use.
Word Processor: A text processing program or system used for electronic writing, processing, and editing.