Computer Architecture Flash Cards
Computer Architecture: The design and organization of a computer system, including its hardware components and the way they interact to execute instructions.
Central Processing Unit (CPU): The primary component of a computer, responsible for most of its processing. It executes instructions, performs calculations, and manages data flow.
Memory Hierarchy: A hierarchy of storage devices in a computer system, ranging from fast and expensive to slow and inexpensive. It includes registers, cache, main memory, and secondary storage.
Instruction Set Architecture (ISA): The interface between the hardware and software of a computer system. It defines the instructions that a computer can execute and the format of those instructions.
Input/Output (I/O): The components and techniques used to communicate with external devices, such as keyboards, mice, printers, and storage devices.
Computer Arithmetic: The branch of computer science that deals with the design and implementation of arithmetic operations in computer systems, including addition, subtraction, multiplication, and division.
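To make the hardware view concrete, here is a minimal C sketch of binary addition built only from XOR (the carry-free sum) and AND (the carry bits), mirroring how an adder circuit propagates carries; the name ripple_add is invented for this example.

```c
#include <stdio.h>
#include <stdint.h>

/* Add two 32-bit values the way an adder circuit does:
   XOR yields the sum without carries, AND finds the carry bits,
   and the loop propagates carries until none remain. */
uint32_t ripple_add(uint32_t a, uint32_t b) {
    while (b != 0) {
        uint32_t carry = (a & b) << 1; /* carries into the next bit position */
        a = a ^ b;                     /* partial sum, ignoring carries */
        b = carry;
    }
    return a;
}

int main(void) {
    printf("%u\n", ripple_add(23, 19)); /* prints 42 */
    return 0;
}
```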
Pipelining: A technique used in computer architecture to increase instruction throughput. It divides the execution of instructions into a series of stages, so that several instructions can be in flight at once, each occupying a different stage.
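As a rough illustration, this toy C simulation prints which instruction occupies which stage of a 3-stage pipeline in each cycle; the stage names and instruction count are made up for the example. Five instructions finish in 7 cycles instead of the 15 a purely sequential machine would need.

```c
#include <stdio.h>

#define STAGES     3
#define NUM_INSTR  5

/* Toy 3-stage pipeline (fetch, decode, execute): in cycle t,
   instruction i occupies stage (t - i) when 0 <= t - i < STAGES,
   so once the pipeline is full, one instruction completes per cycle. */
int main(void) {
    const char *stage_name[STAGES] = { "fetch", "decode", "execute" };
    int total_cycles = NUM_INSTR + STAGES - 1;

    for (int t = 0; t < total_cycles; t++) {
        printf("cycle %d:", t + 1);
        for (int i = 0; i < NUM_INSTR; i++) {
            int stage = t - i;
            if (stage >= 0 && stage < STAGES)
                printf("  I%d:%s", i + 1, stage_name[stage]);
        }
        printf("\n");
    }
    printf("%d instructions in %d cycles instead of %d\n",
           NUM_INSTR, total_cycles, NUM_INSTR * STAGES);
    return 0;
}
```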
Memory Management: The process of controlling and coordinating computer memory, including allocating and deallocating memory space, managing virtual memory, and handling memory access conflicts.
Cache Memory: A small, fast memory component used to store frequently accessed data or instructions. It improves performance by reducing average memory access time.
Virtual Memory: A memory management technique that allows a computer to compensate for physical memory shortages by temporarily transferring data from RAM to disk storage.
Parallel Processing: The simultaneous execution of multiple instructions or tasks by dividing them into smaller subtasks that can be processed in parallel by multiple processors or cores.
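A minimal sketch of the idea using POSIX threads, assuming a platform with pthreads available (compile with -pthread): the array is split into slices, each thread sums its slice, and the main thread combines the partial results. The array contents and thread count are arbitrary.

```c
#include <pthread.h>
#include <stdio.h>

#define N 1000000
#define NUM_THREADS 4

static int data[N];
static long long partial[NUM_THREADS];

/* Each thread sums one contiguous slice of the array. */
static void *sum_slice(void *arg) {
    long id = (long)arg;
    long long s = 0;
    int begin = (int)(id * (N / NUM_THREADS));
    int end   = (id == NUM_THREADS - 1) ? N : begin + N / NUM_THREADS;
    for (int i = begin; i < end; i++)
        s += data[i];
    partial[id] = s;
    return NULL;
}

int main(void) {
    pthread_t tid[NUM_THREADS];
    for (int i = 0; i < N; i++)
        data[i] = 1;

    for (long t = 0; t < NUM_THREADS; t++)
        pthread_create(&tid[t], NULL, sum_slice, (void *)t);

    long long total = 0;
    for (long t = 0; t < NUM_THREADS; t++) {
        pthread_join(tid[t], NULL);
        total += partial[t];   /* combine the per-thread results */
    }
    printf("total = %lld\n", total); /* prints 1000000 */
    return 0;
}
```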
Computer Network: A collection of interconnected devices, such as computers, servers, routers, and switches, that communicate with each other to share resources and information.
Performance Evaluation: The process of assessing and measuring the performance of a computer system, including its speed, efficiency, and resource utilization.
Fault Tolerance: The ability of a computer system to continue operating properly in the event of hardware or software failures, preserving availability and reliability.
Emerging Trends: The latest advancements and developments in computer architecture and related fields, including technologies such as cloud computing, quantum computing, and artificial intelligence.
Instruction Cycle: The sequence of steps that a computer's CPU follows to fetch an instruction, decode it, execute it, and store (write back) the result.
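A toy interpreter in C that makes the cycle explicit; the one-byte instruction format, the opcodes, and the accumulator machine are all invented for this sketch.

```c
#include <stdio.h>
#include <stdint.h>

/* A toy machine: each instruction is one byte, the high nibble the
   opcode and the low nibble an operand. The loop below is the classic
   fetch -> decode -> execute -> store cycle. */
enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_PRINT = 3 };

int main(void) {
    uint8_t program[] = { 0x15, 0x23, 0x30, 0x00 }; /* LOAD 5; ADD 3; PRINT; HALT */
    int pc = 0, acc = 0, running = 1;

    while (running) {
        uint8_t instr = program[pc++];      /* fetch */
        int opcode  = instr >> 4;           /* decode */
        int operand = instr & 0x0F;
        switch (opcode) {                   /* execute */
            case OP_LOAD:  acc = operand;             break;
            case OP_ADD:   acc += operand;            break; /* store: result kept in acc */
            case OP_PRINT: printf("acc = %d\n", acc); break;  /* prints acc = 8 */
            case OP_HALT:  running = 0;               break;
        }
    }
    return 0;
}
```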
Von Neumann Architecture: A computer architecture design that uses a single memory to store both instructions and data, allowing instructions to be treated as data and enabling the execution of stored programs.
Pipelined Processor: A processor that uses pipelining to increase instruction throughput by dividing the execution of instructions into multiple stages.
Cache Coherence: The consistency of data stored in the different caches of a multiprocessor system. It ensures that all processors observe a consistent view of memory.
Instruction-Level Parallelism (ILP): The ability to execute multiple instructions in parallel within a single program, achieved through techniques such as instruction pipelining and superscalar execution.
Interrupt: A signal sent to the CPU by an external device or software, indicating that it requires attention or action. Interrupts are used to handle events such as keyboard input or hardware errors.
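Real hardware interrupts cannot be demonstrated portably from user space, but POSIX signals follow the same pattern: register a handler, get preempted when the event arrives, then resume. This sketch uses Ctrl-C (SIGINT) as a stand-in for a hardware interrupt.

```c
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

/* Signals are a software analogue of hardware interrupts: the main
   loop is preempted, the handler runs, and control then returns. */
static volatile sig_atomic_t got_signal = 0;

static void on_interrupt(int sig) {
    (void)sig;
    got_signal = 1;   /* keep the handler minimal, as with a real ISR */
}

int main(void) {
    signal(SIGINT, on_interrupt);   /* register the handler */
    printf("working... press Ctrl-C to interrupt\n");
    while (!got_signal)
        pause();                    /* wait until an interrupt arrives */
    printf("interrupt handled, resuming shutdown\n");
    return 0;
}
```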
Instruction Set: The set of all instructions that a computer's CPU can execute. It defines the operations that can be performed, the data types that can be manipulated, and the addressing modes used.
Addressing Modes: The different ways in which a computer's CPU can specify the location of data or instructions in memory, such as direct addressing, indirect addressing, and indexed addressing.
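The C expressions below loosely mirror four common modes; which machine addressing mode the compiler actually emits depends on the target, so treat this as an analogy rather than a guarantee.

```c
#include <stdio.h>

/* C expressions that roughly correspond to common addressing modes. */
int main(void) {
    int table[4] = { 10, 20, 30, 40 };
    int value = 7;
    int *ptr = &value;
    int i = 2;

    int a = 5;         /* immediate: the operand is encoded in the instruction */
    int b = value;     /* direct: the operand sits at a known address          */
    int c = *ptr;      /* indirect: a register/memory cell holds the address   */
    int d = table[i];  /* indexed: base address plus an index                  */

    printf("%d %d %d %d\n", a, b, c, d); /* prints 5 7 7 30 */
    return 0;
}
```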
Cache Mapping: The process of determining how data is stored and retrieved in a cache memory. Common mapping techniques include direct mapping, associative mapping, and set-associative mapping.
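For direct mapping, an address decomposes into a tag, a line index, and a byte offset. The sketch below assumes a hypothetical cache with 64-byte lines and 256 lines; both sizes are chosen only for the example.

```c
#include <stdio.h>
#include <stdint.h>

/* Direct mapping: each memory block maps to exactly one cache line.
   With 64-byte lines (6 offset bits) and 256 lines (8 index bits),
   an address splits into offset, index, and the remaining tag bits. */
#define OFFSET_BITS 6
#define INDEX_BITS  8

int main(void) {
    uint32_t addr = 0x12345678;

    uint32_t offset = addr & ((1u << OFFSET_BITS) - 1);
    uint32_t index  = (addr >> OFFSET_BITS) & ((1u << INDEX_BITS) - 1);
    uint32_t tag    = addr >> (OFFSET_BITS + INDEX_BITS);

    printf("addr 0x%08x -> tag 0x%x, line %u, offset %u\n",
           addr, tag, index, offset);
    return 0;
}
```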
Paging: A memory management technique that divides a computer's virtual memory into fixed-size blocks called pages, allowing for efficient memory allocation and management.
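A sketch of the address translation, assuming 4 KiB pages and a made-up four-entry page table: the virtual page number indexes the table to find the physical frame, and the offset is carried over unchanged.

```c
#include <stdio.h>
#include <stdint.h>

/* Paging with 4 KiB pages: a virtual address splits into a page number
   and an offset; the page table maps page numbers to frame numbers. */
#define PAGE_SIZE 4096u

int main(void) {
    /* Hypothetical page table: virtual page i lives in frame page_table[i]. */
    uint32_t page_table[] = { 5, 9, 3, 7 };

    uint32_t vaddr  = 2 * PAGE_SIZE + 123;  /* page 2, offset 123 */
    uint32_t page   = vaddr / PAGE_SIZE;
    uint32_t offset = vaddr % PAGE_SIZE;
    uint32_t paddr  = page_table[page] * PAGE_SIZE + offset;

    printf("vaddr %u -> page %u, offset %u -> paddr %u\n",
           vaddr, page, offset, paddr);
    return 0;
}
```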
Parallel Processing Models: Different approaches to parallel processing, such as SIMD (Single Instruction, Multiple Data), MIMD (Multiple Instruction, Multiple Data), and SPMD (Single Program, Multiple Data).
Network Topology: The physical or logical layout of a computer network, including how devices are connected and the paths that data takes to travel between them. Common topologies include bus, star, and mesh.
Benchmarking: The process of measuring and evaluating the performance of a computer system or component, often by running standardized tests or comparing against known standards.
RAID (Redundant Array of Independent Disks): A data storage technology that combines multiple physical disk drives into a single logical unit for improved performance, reliability, or both.
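A sketch of the XOR-parity idea used by parity-based RAID levels such as RAID 5: the parity block is the XOR of the data blocks, so any single lost block can be rebuilt from the survivors. The "disks" here are just byte arrays.

```c
#include <stdio.h>
#include <stdint.h>

#define STRIPE 8   /* bytes per disk in one stripe */

int main(void) {
    uint8_t disk0[STRIPE] = "ABCDEFG";
    uint8_t disk1[STRIPE] = "1234567";
    uint8_t parity[STRIPE];

    /* Parity is the XOR of the data blocks. */
    for (int i = 0; i < STRIPE; i++)
        parity[i] = disk0[i] ^ disk1[i];

    /* Simulate losing disk1 and rebuilding it from disk0 and parity. */
    uint8_t rebuilt[STRIPE];
    for (int i = 0; i < STRIPE; i++)
        rebuilt[i] = disk0[i] ^ parity[i];

    printf("rebuilt disk1: %s\n", rebuilt); /* prints 1234567 */
    return 0;
}
```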
Cloud Computing: A model for delivering computing resources over the internet, allowing users to access and use applications, storage, and processing power on demand, without the need for local infrastructure.
Quantum Computing: A computing paradigm that uses quantum bits, or qubits, to represent and manipulate data. It has the potential to solve certain problems much faster than classical computers.
Artificial Intelligence (AI): The simulation of human intelligence in machines, enabling them to perform tasks that typically require human intelligence, such as speech recognition, decision-making, and problem-solving.
Internet of Things (IoT): A network of interconnected physical devices, vehicles, appliances, and other objects embedded with sensors, software, and network connectivity, enabling them to collect and exchange data.
Big Data: Data sets so large and complex that they cannot easily be managed, processed, or analyzed with traditional data processing techniques. Big data is often characterized by high volume, high velocity, and high variety.
Machine Learning: A subset of artificial intelligence that focuses on the development of algorithms and models that allow computers to learn and make predictions or decisions without explicit programming.
Neural Network: A type of machine learning model inspired by the structure and function of the human brain. It consists of interconnected nodes, or artificial neurons, that process and transmit information.
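A single artificial neuron in C: a weighted sum of the inputs plus a bias, passed through a sigmoid activation. The weights and inputs are made-up values; compile with -lm for the math library.

```c
#include <stdio.h>
#include <math.h>

/* One artificial neuron: weighted sum of inputs plus a bias, passed
   through a sigmoid activation. Networks stack many of these. */
double neuron(const double *x, const double *w, double bias, int n) {
    double sum = bias;
    for (int i = 0; i < n; i++)
        sum += w[i] * x[i];
    return 1.0 / (1.0 + exp(-sum));   /* sigmoid squashes to (0, 1) */
}

int main(void) {
    double inputs[3]  = { 0.5, -1.0, 2.0 };  /* made-up values */
    double weights[3] = { 0.4,  0.6, 0.1 };
    printf("activation = %f\n", neuron(inputs, weights, 0.0, 3));
    return 0;
}
```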
Cybersecurity: The practice of protecting computer systems, networks, and data from digital attacks, theft, and damage. It involves implementing security measures and protocols to prevent unauthorized access or use.
Cloud Storage: A service that allows users to store and access data over the internet, eliminating the need for local storage devices. It offers scalability, accessibility, and data redundancy.
Mobile Computing: The use of portable computing devices, such as smartphones and tablets, to access computing resources and applications on the go, typically over wireless networks.
Embedded Systems: Computer systems designed to perform specific tasks or functions within larger systems or devices. They are often dedicated, real-time systems with limited resources and power constraints.
Data Compression: The process of reducing the size of data to save storage space or transmission bandwidth. It involves encoding data in a more efficient representation, often by removing redundant or irrelevant information.
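A minimal run-length encoder in C, one of the simplest examples of removing redundancy; the character-plus-count output format is chosen arbitrarily for the sketch.

```c
#include <stdio.h>
#include <string.h>

/* Run-length encoding: replace each run of repeated characters with
   the character followed by its count. */
void rle_encode(const char *in) {
    size_t n = strlen(in);
    for (size_t i = 0; i < n; ) {
        size_t run = 1;
        while (i + run < n && in[i + run] == in[i])
            run++;                       /* measure the run of equal chars */
        printf("%c%zu", in[i], run);
        i += run;
    }
    printf("\n");
}

int main(void) {
    rle_encode("AAAABBBCCD");   /* prints A4B3C2D1 */
    return 0;
}
```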
Computer Graphics: The creation, manipulation, and rendering of visual content using computers. It encompasses areas such as 2D and 3D graphics, image processing, animation, and virtual reality.
Operating System: Software that manages computer hardware and software resources, provides common services for computer programs, and allows them to run efficiently and interact with users.
Distributed Computing: A network of computers that work together to achieve a common goal, often by dividing tasks among multiple nodes. It enables scalability, fault tolerance, and resource sharing.
Computer Security: The protection of computer systems and data from unauthorized access, use, disclosure, disruption, modification, or destruction. It involves implementing security measures and policies.
Software Engineering: The application of engineering principles and practices to the design, development, testing, and maintenance of software systems. It involves managing complexity, ensuring quality, and meeting user requirements.
Computer Ethics: The study of ethical issues and moral dilemmas related to the use of computers and technology. It involves considering the impact of technology on individuals, society, and the environment.
Computer Virus: Malicious software that replicates itself and spreads to other computers, often causing damage or disrupting normal operation. Viruses can be transmitted through email, downloads, or infected files.