
How does a computer distinguish whether a binary pattern is an instruction or just a number?

I am reading the book "Computer Organization and Embedded Systems" by Hamacher, and my question is: how does a computer distinguish whether a binary pattern is an instruction or just a number?

Can anyone help me understand that concept?

A von Neumann processor (which is pretty much any processor out there) cannot distinguish between code and data in memory. Whatever the CPU's instruction pointer points to is loaded into the instruction decoder and treated as an instruction. If it is not a valid instruction, the CPU raises an exception.
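For a concrete picture, here is a minimal C sketch using a made-up 32-bit encoding (an assumed 8-bit opcode followed by three 8-bit register fields, not any real ISA): the very same word can be printed as a plain integer or carved into instruction fields, because the bits themselves carry no tag saying which they are.

```c
/* Minimal sketch: the same 32-bit pattern read as a number and then
 * decoded under a hypothetical instruction format. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint32_t word = 0x01020304u;          /* just 32 bits in memory         */

    /* Read as data: it is simply the number 16909060. */
    printf("as a number : %u\n", word);

    /* Read as an instruction: carve the same bits into fields. */
    uint8_t opcode = (word >> 24) & 0xFF; /* 0x01 -> e.g. an "ADD"          */
    uint8_t rd     = (word >> 16) & 0xFF; /* destination register           */
    uint8_t rs1    = (word >>  8) & 0xFF; /* first source register          */
    uint8_t rs2    =  word        & 0xFF; /* second source register         */
    printf("as an instr : op=%u rd=%u rs1=%u rs2=%u\n", opcode, rd, rs1, rs2);
    return 0;
}
```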

This enables a program to create new executable code in memory, or even to modify its own code. On the other hand, it also enables many code injection attacks.
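As a hedged illustration of that point, the following sketch works on x86-64 Linux only (and ignores the W^X restrictions a hardened system may enforce): the program writes a few bytes into a buffer it allocated itself and then jumps to them. The bytes are "data" while they are being copied and "instructions" the moment the instruction pointer reaches them.

```c
/* x86-64 Linux sketch: generate machine code (mov eax, 42; ret) at run time
 * and execute it.  Nothing in memory marks these bytes as code beyond the
 * page's execute permission. */
#define _DEFAULT_SOURCE
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00,  /* mov eax, 42 */
                             0xC3 };                        /* ret         */

    /* Ask for a page that is both writable and executable. */
    void *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (buf == MAP_FAILED) { perror("mmap"); return 1; }

    memcpy(buf, code, sizeof code);       /* the bytes are "data" here...   */

    int (*fn)(void) = (int (*)(void))buf; /* ...and "instructions" here     */
    printf("%d\n", fn());                 /* prints 42                      */

    munmap(buf, 4096);
    return 0;
}
```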

The way the computer distinguishes between instructions and numbers simply depends on what is reading the data and where. For example, a simple arithmetic logic unit (ALU) has one input that selects the operation to be performed and two inputs for the operands. The data arriving at the operand inputs is read as numbers, whereas the data arriving at the operation input is read as (part of) an instruction.
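A toy software model of that idea, using invented op codes rather than any real ALU's encoding: the value fed to the op parameter selects the operation, while the values fed to the operand parameters are treated purely as numbers, even though all three are just bit patterns.

```c
/* Toy ALU sketch with hypothetical op codes: the "op" input is interpreted
 * as an operation selector, the "a" and "b" inputs as plain numbers. */
#include <stdio.h>
#include <stdint.h>

enum { ALU_ADD = 0, ALU_SUB = 1, ALU_AND = 2, ALU_OR = 3 };

static uint32_t alu(uint8_t op, uint32_t a, uint32_t b) {
    switch (op) {
        case ALU_ADD: return a + b;
        case ALU_SUB: return a - b;
        case ALU_AND: return a & b;
        case ALU_OR:  return a | b;
        default:      return 0;   /* real hardware might flag an error here */
    }
}

int main(void) {
    /* The value 2 on an operand input is just the number two... */
    printf("%u\n", alu(ALU_SUB, 10, 2));  /* 10 - 2 = 8                     */
    /* ...while the value 2 on the op input selects AND.                    */
    printf("%u\n", alu(2, 10, 2));        /* 10 & 2 = 2                     */
    return 0;
}
```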

It all depends on which unit of the computer's architecture is reading the data, and on which of that unit's inputs the data arrives.

