Pattonias
I have a question for which I can't seem to find a satisfactory answer.
We do a lot of intense real-time processing of audio signals in my line of work (generally programmed in C).
I am a mechanical engineer with only a rudimentary understanding of what goes on in the software side of the project, and I can't seem to find an answer that really fits our situation.
We often spec laptops/PCs for field work, and the machines I select for approval, with the newest architecture and high performance, are frequently overruled in favour of processors that are sometimes several years old and seemingly offer nothing but a higher clock rate: 3.1 GHz (2-3 years old) as opposed to 2.5-2.8 GHz (current generation, new architecture).
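As I understand it (and this is part of why I am asking), the time a fixed workload takes should depend on more than the clock, roughly

$$ t \approx \frac{N_{\text{instructions}}}{\text{IPC} \times f} $$

where IPC is the number of instructions the core retires per cycle, which is exactly what a newer architecture is supposed to improve. By that reasoning a 2.8 GHz part that retires, say, 25% more instructions per cycle would match a 3.5 GHz part of the older design. The numbers here are made up purely for illustration.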
When performing intense processing, is it still purely the clock rate that matters? Why do benchmarks show improved performance from the lower-clocked processors with newer architecture, yet this does not seem to count when selecting a PC for custom-coded applications?
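To make the question concrete, here is a minimal, hypothetical sketch (not our actual code) of the kind of hot loop I mean, with a crude wall-clock timer around it. Running something like this on both candidate machines would seem to measure the thing we actually care about, rather than the number on the spec sheet:

```c
/* Hypothetical example -- not our actual code. A simplified
 * FIR-filter-style inner loop, timed with clock_gettime().
 * Compile with e.g.: gcc -O2 fir_bench.c -o fir_bench
 */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

#define TAPS   64
#define FRAMES 1000000

int main(void)
{
    static float coeff[TAPS];
    static float buf[FRAMES + TAPS];
    static float out[FRAMES];

    /* Fill with arbitrary data so the loop does real work. */
    for (int i = 0; i < TAPS; i++)          coeff[i] = 1.0f / (i + 1);
    for (int i = 0; i < FRAMES + TAPS; i++) buf[i]   = (float)(i % 97);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    /* The hot loop: one multiply-accumulate chain per output sample.
     * How fast this runs depends on the clock rate *and* on how many
     * of these operations the core can retire per cycle (SIMD width,
     * pipelining, cache behaviour), which is what newer architectures
     * improve. */
    for (int n = 0; n < FRAMES; n++) {
        float acc = 0.0f;
        for (int k = 0; k < TAPS; k++)
            acc += coeff[k] * buf[n + k];
        out[n] = acc;
    }

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;

    /* Printing a result keeps the compiler from discarding the loop. */
    printf("%d frames in %.3f s (%.1f Msamples/s), last sample %f\n",
           FRAMES, secs, FRAMES / secs / 1e6, out[FRAMES - 1]);
    return 0;
}
```

The Msamples/s figure would give a single throughput number to compare across machines, independent of what the spec sheet says about GHz.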
I guess my question is whether the higher clock rate is actually needed, or whether the other engineers are following a rule of thumb that discounts the advantages of the new architecture.
When I tried to research this, most of the commentary centered on the perceived performance advantage of current-generation processors on specific metrics, such as file transfer or game performance, which could plausibly be improved by the architecture but not necessarily attributed to the actual processing being done faster.