AMD developing reverse Hyper-Threading?



From what it seems, it won't actually double the clock speed, but the performance would scale as if the clock speed were doubled, so all is well.

I hope that the CPU/drivers are intelligent enough to distinguish between single- and multi-threaded apps and switch "modes" for different apps, giving the best of both worlds.

It would essentially be the same as doubling the clock speed. AMD chips do 9 (I think) operations per cycle, so if you double that to 18 on a 2 GHz CPU, that's the same as doing 9 instructions per cycle on a 4 GHz CPU = 36 GIPS.
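The arithmetic in the post above can be spelled out in a few lines. Note the "9 operations per cycle" figure is the poster's own guess, not a published spec, so treat these numbers as purely illustrative:

```python
# Back-of-the-envelope throughput arithmetic from the post above.
# The 9 instructions-per-cycle figure is the poster's assumption.
instructions_per_cycle = 9
clock_ghz = 2.0

base_gips = instructions_per_cycle * clock_ghz          # 9 IPC * 2 GHz = 18 GIPS
doubled_gips = 2 * instructions_per_cycle * clock_ghz   # 18 IPC * 2 GHz = 36 GIPS

# Same throughput as 9 IPC at 4 GHz, which is the poster's equivalence claim.
print(base_gips, doubled_gips)  # 18.0 36.0
```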

I think the problem is that there will be overhead. It's the same as with multithreaded applications: you still don't see twice the performance when you double the number of CPUs, since it doesn't scale linearly.

IBM does something sort of like this in its mainframes. They use two processors to execute the same instruction and then compare the results. [...] Think of it as RAID 1 for CPUs.

Your analogy is quite helpful. While IBM essentially puts the two processors in RAID 1, this new technology under development by AMD would put the processors in a configuration similar to RAID 0. It acts like one big processor, since the load is split amongst them.
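The two RAID analogies can be made concrete with a toy sketch. This is purely illustrative, not how either vendor actually does it: "RAID 1" runs the full job on both workers and compares results, while "RAID 0" stripes the job across workers and recombines:

```python
# Toy illustration of the RAID analogy for CPUs.
from concurrent.futures import ThreadPoolExecutor

data = list(range(8))

def work(chunk):
    return sum(x * x for x in chunk)

with ThreadPoolExecutor(max_workers=2) as pool:
    # "RAID 1" style: both workers do the full job, results are compared,
    # like IBM's lockstep execution.
    a, b = pool.map(work, [data, data])
    assert a == b

    # "RAID 0" style: the job is split between workers and recombined,
    # like the rumored AMD approach.
    half = len(data) // 2
    striped = sum(pool.map(work, [data[:half], data[half:]]))

assert striped == work(data)
```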

Another analogy would be to call this the "DDR" of processors. (No, not Dance Dance Revolution...)

Why are you people acting like it will have the same effect as doubling the GHz? This won't give you a 5 GHz machine from a dual-core 2.5 GHz, just like the current dual cores are not "doubling" the speed. This isn't much different: it will speed things up, but it won't double the GHz, lol.

How can you possibly make a claim like that when this is just a rumor and no details about the technology (nor even confirmation of its existence) have been released yet?

Speculation is one thing, but claiming to know what the effect will or won't be is stupid.

This isn't anything new or innovative on AMD's part. Intel has been working on a similar technology.

Intel Mitosis: http://www.intel.com/technology/magazine/r...eading-1205.htm

Thanks for the link; this really is a good read for me...

It seems Intel's researchers have put together a really good concept.

Very early benchmarks show more than a 2x performance increase at the cost of power, with no performance penalty, since either way a single thread will only use one core, which I find very cool.

Though everything seems fine, one sentence just killed the joy:

http://www.anandtech.com/printarticle.aspx?i=2507 (scroll down to page 7)

"As you can see, offering in many cases a 2 - 3x performance improvement is nothing short of impressive. But keep in mind, this project is in its very early stages of research and as promising as this looks, it may take 5 - 10 years for the research to make its way into the real world."

So I don't believe AMD can release a new CPU that uses a technique similar to Intel's Mitosis any time soon, like H2 2007.

I agree that this probably won't double the speed of processors. Just take a look at SLI; it was something like up to 180% faster, wasn't it? I'm sure emulating two cores as one wouldn't double the speed, but we would see a great improvement.

(Before this post, I'll bluntly state that I am an Intel fanboy. That being said...)

This is stupid because it wouldn't push developers to write applications that take advantage of multicore processors. This would be like getting NTFS to work on Windows 3.11: going back to the past.

I am completely against both AMD and Intel implementing this tech. So all you AMD fans are doing is making the computer world worse by supporting this...

Definitely innovative and intriguing, but I honestly don't think it will be that important. All the really important, CPU-intensive software from the big companies will be rewritten to take advantage of multiple processors if it hasn't been already: Windows, Photoshop, video editing software, games, etc.

As for more "amateur" coders who don't understand how to do parallel programming, or don't want to bother, they aren't going to be making massive programs anyway. Single-threaded programs will always be out there, but things like audio players and desktop widget software don't need to run on multiple cores. They're small, simple programs that will run fine even on an old Pentium.

Video and photo editing software will be (hell, already is) mostly multi-threaded; the problem is with games. Games are still heavily single-threaded, and will be for a long time. Current and upcoming game engines are not being designed to make use of more than one processing core, in which case AMD's new architecture would greatly benefit them.

Won't need to multi-thread apps then. This would be really awesome.

Now that is completely untrue. While yes, in theory something like this could make single-threaded apps perform better on multi-processor machines, it has absolutely no effect on the usefulness (nay, necessity) of multi-threading in a great many situations.

One obvious example: if you don't want your UI thread blocking on background processing, you're going to want a multi-threaded app.
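The UI-responsiveness point can be sketched in a few lines. A minimal illustration (the "UI" here is just a polling loop standing in for an event loop):

```python
# Why multi-threading stays necessary even with fast single-thread
# execution: a "UI" thread must keep responding while slow work runs.
import threading
import time

def background_job(result):
    time.sleep(0.2)          # simulate slow background processing
    result["value"] = 42

result = {}
worker = threading.Thread(target=background_job, args=(result,))
worker.start()

# The "UI" thread stays free to handle events while the job runs.
ticks = 0
while worker.is_alive():
    ticks += 1               # stand-in for repainting / handling input
    time.sleep(0.01)

worker.join()
print(result["value"])  # 42
```

Without the worker thread, the `time.sleep(0.2)` would freeze the loop entirely; no amount of single-thread speedup changes that structural need.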

This isn't good news, because almost everybody has, or is getting, dual cores, and whenever some widely used program doesn't use both cores, its forums are full of requests to add that. This move from AMD will make programmers not bother.

This is a regression, not an evolution.

The idea is great in theory, but it isn't good, because we will waste time on this.

By the way, what about Windows Vista? It's almost entirely built to use dual cores. What will this technology do on Vista, when Vista is using both cores and you run a program that isn't dual-threaded (so this technology kicks in)? It will cause some DLL errors on Windows at minimum, for sure.

You're talking out of your ass you know.

A multi-core CPU that can combine its physical cores into one big virtual processor would be very interesting when the programs you're running are very CPU-intensive, like games, or compiling while programming.

You're assuming that the CPU, which is multi-core, would no longer be able to act as a multi-core and would always show up as one powerful CPU, but that's wrong. Stop assuming and wait for technical information before spreading bullcrap about DLL errors. :rofl:

Genius! :D

I wouldn't call it genius. More like evolution. Eventually, someone would come up with this idea. It's not that unexpected; it's just hard to think of how it will be done.

Imagine a dual-core FX-60 2.6 GHz running in 'single core mode' at 5.2 GHz! :cool:

Umm, no. This doesn't mean dual-core CPU speeds will double. This means one thread can be spread across two cores. The performance gain has yet to be seen, or even accurately calculated/estimated.

Why didn't anyone think of this before? :)

Someone probably already did. The hard part is: how do you implement it? This is not a simple task by any means.
