Meaning of computer architecture

2014-07
  • Tim

    I am not a CS student. I have tried many times to understand what computer architecture means, including reading its Wikipedia article, but I still haven't got a clear understanding.

    Besides directly explaining its meaning and purpose, I think it will also be helpful if the following questions can be addressed:

    1. What are the relations and differences between computer architecture and computer organization?
    2. What are the relations and differences between computer architecture and operating systems?
    3. What are the relations and differences between computer architecture and abstract computing models such as the Turing machine?
    4. Are SISD, MIMD, SIMD and MISD considered computer architectures? Are parallel computing and distributed computing systems considered computer architectures?

    Thanks and regards!

  • Answers
  • Daniel R Hicks

    It's a good question, actually, but there's probably no answer that will satisfy the critics. I've got 3-4 books on the shelf here with the term "computer architecture" in the title, but you'd not find a lot of similarity between them, and over the years I can recall many discussions (arguments) as to what "architecture" means and whether or not the term can really be applied to computers.

    Having been (unofficially) a "computer architect" several times in my career, though, I can probably offer some general thoughts on the topic (and I will, if I don't get too bored first, and if this doesn't get censored for some reason).

    First off, the term "computer architecture" can be applied at several different levels of abstraction. Generally at the lowest level -- gates and flip-flops -- the term "architecture" is not used, since "architecture" itself implies some degree of abstraction. But one level up -- at the level of buses and registers -- the term begins to apply. And it applies at successive levels -- RISC vs CISC, register vs stack, bus vs switch. And, as machines get more and more complex, it applies to additional levels such as memory subsystems, processor arrays, etc.
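
    To make the "register vs stack" distinction a little more concrete, here is a minimal, purely illustrative sketch in Python -- a toy interpreter, not any real instruction set; the opcodes, register names, and helper functions are invented for the example -- showing the same expression, a + b*c, written for a hypothetical stack machine and for a hypothetical register machine.

        # Toy illustration (not any real ISA): the same expression a + b*c
        # expressed for a hypothetical stack machine and a hypothetical
        # register machine, to show how the architectural choice shapes the
        # instruction set that software sees.

        def run_stack(program, env):
            """Run a stack-machine program of ("push", name), ("add",), ("mul",)."""
            stack = []
            for op, *args in program:
                if op == "push":
                    stack.append(env[args[0]])
                elif op == "add":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b)
                elif op == "mul":
                    b, a = stack.pop(), stack.pop()
                    stack.append(a * b)
            return stack.pop()

        def run_register(program, env):
            """Run a register-machine program of ("load", rd, name), ("add"/"mul", rd, rs1, rs2)."""
            regs = {}
            for op, *args in program:
                if op == "load":
                    regs[args[0]] = env[args[1]]
                elif op == "add":
                    regs[args[0]] = regs[args[1]] + regs[args[2]]
                elif op == "mul":
                    regs[args[0]] = regs[args[1]] * regs[args[2]]
            return regs[program[-1][1]]   # result is in the last destination register

        env = {"a": 2, "b": 3, "c": 4}

        # Stack machine: operands are implicit (always the top of the stack).
        stack_prog = [("push", "a"), ("push", "b"), ("push", "c"), ("mul",), ("add",)]

        # Register machine: every operand is a named register.
        reg_prog = [("load", "r1", "a"), ("load", "r2", "b"), ("load", "r3", "c"),
                    ("mul", "r4", "r2", "r3"), ("add", "r5", "r1", "r4")]

        print(run_stack(stack_prog, env))     # 14
        print(run_register(reg_prog, env))    # 14

    Both programs compute the same value; what differs is the shape of the instructions, which is exactly the kind of decision an architecture records.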

    This really is very much like how building architecture may be thought of: The basic design of a 100 floor skyscraper may be summarized in a single drawing -- that's architecture. Or the design of the entrance may be expressed in a detailed drawing that is accurate to a fraction of an inch -- architecture again, with many possible layers in-between.

    Getting back to those arguments -- I mean discussions -- we had years ago, the most satisfactory answer I ever heard of the meaning of "architecture" is that it's an expression of the THOUGHT that goes into a structure. There are structures (including both buildings and computers) which appear to not have involved much thought, and the "designs" of these therefore are not expressions of "architecture". An architecture expresses (and to some degree explains and justifies) the thought that went into a design.

    As to the specific questions:

    1. I'm not sure what you mean by "organization".
    2. Since architecture is expressed at different levels, there is often a level which expresses the "union" of the hardware and operating system, and there are even (a precious few) operating systems that can themselves be said to have an "architecture".
    3. Generally an abstract computing model is not an "architecture", but an architecture may "reference" an abstract computing model as a way of expressing a design.
    4. SISD, MIMD, SIMD, MISD, parallel, and distributed are all terms that may be used to describe an architecture, but by themselves (absent the thought that gives real meaning and context to the terms) they are not architectures. (A rough sketch of the SISD/SIMD distinction follows below.)
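
    To put a little flesh on the Flynn's-taxonomy terms in point 4, here is a rough sketch in Python (assuming NumPy is available) of the difference in programming model between SISD-style processing, where one instruction stream touches one data element at a time, and SIMD-style processing, where one operation is applied to a whole vector of data. NumPy's vectorized add only stands in as an analogy for what SIMD hardware does; it is not itself an architecture.

        # Rough analogy only: SISD-style processing vs SIMD-style processing.
        # NumPy stands in for the vector hardware; the point is the programming
        # model (scalar loop vs one whole-vector operation), not the silicon.
        import numpy as np

        a = np.arange(10_000, dtype=np.float64)
        b = np.arange(10_000, dtype=np.float64)

        # SISD flavour: an explicit scalar loop, one addition per iteration.
        c_scalar = np.empty_like(a)
        for i in range(len(a)):
            c_scalar[i] = a[i] + b[i]

        # SIMD flavour: a single "add these vectors" operation; the library
        # (and, underneath it, the CPU's vector units) handles the elements.
        c_vector = a + b

        assert np.array_equal(c_scalar, c_vector)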

    (Can I give any references to support the above? Probably could dig up one or two, but someone else could dig up contradictory references. It really boils down to opinion and who you believe.)


  • Related Question

    Harvard vs. Von Neumann architecture
  • user32569

    Our teacher told us that the Harvard architecture is the most evolved architecture, the one most produced today and going forward. But I think that, because of the huge installed base of x86 and Von Neumann-based ARM systems, Von Neumann is actually the most used architecture today.

    Yes, Harvard MCUs are produced in even greater numbers, but since they mostly serve minor roles (compared to x86 and ARM based systems), I'd say Von Neumann is actually the dominant one. Or is it really Harvard?

    And second, I know this is a strange question, but does any architecture combining both exist? One with separate memories for data and programs, and therefore faster instruction processing, but still able to work with them the way a Von Neumann machine does? That is, able to load and unload programs into program memory on the fly? Isn't this the way x86 should have gone? Or is there some bottleneck that pure Von Neumann solves? Thanks.


  • Related Answers
  • AndrejaKo

    x86 is a combination of both. If you take a look at the L1 cache of modern processors, you'll notice that there is a separate cache for data and a separate cache for instructions. Also, do some digging on Wikipedia about x86. You see, modern x86 processors aren't actually x86. They instead emulate an x86 processor by translating x86 instructions into their own internal microcode. In fact, on some of Intel's processors the microcode can be changed while the system is running.

    As for which is produced in greater numbers, I'd say it's Harvard, mostly because of PICs and similar microcontrollers. As far as I can see, Von Neumann is easier to program, so it's not uncommon for internally Harvard processors (like x86 and some PICs) to present themselves to the world as Von Neumann.
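
    To make that distinction concrete, here is a toy Python sketch (not a model of any real chip; the class and method names are invented for the example) of the memory arrangement in each case: in a Von Neumann machine, instructions and data share one memory, so a program can overwrite its own code, while in a strict Harvard machine they live in separate memories, so ordinary stores can never touch code.

        # Toy illustration of the memory-model difference (not a real CPU model).

        class VonNeumannMachine:
            """One address space: instructions and data share the same memory,
            so a store can overwrite code (self-modifying programs are possible)."""
            def __init__(self, memory):
                self.memory = memory        # holds both instructions and data

            def store(self, addr, value):
                self.memory[addr] = value   # might be data... or the next instruction

            def fetch_instruction(self, pc):
                return self.memory[pc]      # fetched from the same memory as data

        class HarvardMachine:
            """Two address spaces: instruction memory is separate from data memory,
            so ordinary stores cannot touch code, and both can be accessed in parallel."""
            def __init__(self, program, data):
                self.program = program      # instruction memory (read-only here)
                self.data = data            # data memory

            def store(self, addr, value):
                self.data[addr] = value     # can only ever modify data

            def fetch_instruction(self, pc):
                return self.program[pc]     # separate path from data accesses

        # Von Neumann: a store to address 1 replaces an instruction.
        vn = VonNeumannMachine({0: "add r1, r2", 1: "halt"})
        vn.store(1, "jmp 0")                # self-modifying code is possible
        print(vn.fetch_instruction(1))      # -> "jmp 0"

        # Harvard: the same store lands in data memory; the code is untouched.
        hv = HarvardMachine(program={0: "add r1, r2", 1: "halt"}, data={0: 0, 1: 0})
        hv.store(1, 42)                     # modifies data memory only
        print(hv.fetch_instruction(1))      # -> "halt"

    The "modified Harvard" arrangement described above -- split L1 instruction and data caches in front of a single main memory -- sits between the two: software sees one Von Neumann address space, so programs can still be loaded into memory and run, while the core fetches instructions and data over separate paths.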