AGEIA (Creator of PhysX) has fully been absorbed



crenn
05-13-2008, 09:49 PM
Yep, that's right, AGEIA no longer has a website. Going to http://www.ageia.com/ now redirects automatically to this page:
http://www.nvidia.com/object/nvidia_physx.html

I found this out via this page:
http://www.nvidia.com/Download/index.aspx?lang=en-us
Just take a look down under other drivers and you'll find this:
NVIDIA PhysX System Software
Note that when you click on it, it has this in its description:

Does NOT include NVIDIA GPU PhysX Support

I can't find much information on it, but it seems that NVidia is keeping its word..... GPU PhysX is coming.

The boy 4rm oz
05-13-2008, 10:00 PM
Yeah it was a matter of time until the Ageia site went down.



I can't find much information on it, but it seems that Nvidia is keeping its word..... GPU PhysX is coming.

Actually it isn't as new as I thought. I recently got FEAR back from a friend and have been playing with it for a bit, tweaking settings, and came across something interesting. Under one of the Advanced system settings tabs there are two sliders. One allows you to allocate a certain percentage of your GPU to physics calculations and the other lets you do the same with the RAM (maybe the CPU as well). After playing with this you really have to cut down the other settings to get good frame rates, even with the GPU only at 25%.

It will be interesting to see how Nvidia deal with this. I may end up getting a cheap 88** GPU (only the 88** series and above support this feature, I think) and dedicating it completely to physics, or just get a physics card.

crenn
05-13-2008, 10:20 PM
There won't be any physics-specific cards. Any 8 series or higher should support CUDA. A higher end card may be better though. I'd wait until GPU PhysX is released before deciding. I think a dual card approach will yield better results.

I never knew about that, I'll have to borrow my friend's computer and check it out.

The boy 4rm oz
05-13-2008, 10:35 PM
I know there won't be any physics-specific cards, but what I am saying is I would rather still have one good card for graphics and a lower-specced card for the physics. I was thinking maybe an 8800GT, or an 8500 if the entire 8*** series is supported.

J-Roc
05-14-2008, 08:34 PM
I know there won't be any physics-specific cards, but what I am saying is I would rather still have one good card for graphics and a lower-specced card for the physics. I was thinking maybe an 8800GT, or an 8500 if the entire 8*** series is supported.

I foresee one major problem with that approach. The second card won't be enabled unless it's in SLI mode, which means you'll need two identical cards. Also, the 88 series doesn't support physics in the first place (as far as I know). Nvidia would have to do some serious driver coding to get physics computing on a card that never had the ability to do it to begin with. Look for Nvidia Quantum Effects capable cards.

crenn
05-14-2008, 09:04 PM
J-Roc, all 8 series cards are CUDA enabled. That's how they're doing it. You can have two cards in your computer, but at this stage it's too early to know the requirements.

The boy 4rm oz
05-15-2008, 12:45 AM
I'm probably just going to get a PhysX card; it will be a lot easier to set up and probably have more performance gains. With Ageia getting taken over, the prices should drop.

crenn
05-15-2008, 03:52 AM
WRONG!

I'd wait for the GPU PhysX to come out, in theory it should be a lot more powerful than the normal PhysX card.

The boy 4rm oz
05-15-2008, 08:18 AM
I will wait to see a comparison, but I still can't see it being better than an independent card solution. Yes, it may be more powerful, but will the rest of your visuals suffer from the extra usage of the GPU? I'm not going to turn down the sliders on all my games just so I can see some boxes move differently.

Trace
05-15-2008, 09:09 PM
Lol, "Just to see some boxes move differently"

That's going into my sig

The boy 4rm oz
05-16-2008, 03:26 AM
Lol, well yeah, that's all it really will do. It'll take strain off the CPU so boxes and explosions will be graded by impact force, angle and objects.

crenn
05-16-2008, 05:02 AM
The reason the CPU isn't fully suited to it is the sheer number of simultaneous calculations needed.

The boy 4rm oz
05-16-2008, 05:56 AM
And the fact that it runs everything else.

crenn
05-16-2008, 09:01 AM
Believe me, I have a quad core and a heavy PhysX level uses 75% of it... there is more room.

The boy 4rm oz
05-16-2008, 09:24 AM
Yeah I have the same CPU as you.

AlphaTeam
05-17-2008, 11:38 AM
I don't like the PhysX card. I bought it the first day it came out because I loved the Cell Factor demo. I have to say that game is pretty amazing.

The reasons I didn't like the card: 1) my computer wasn't powerful enough, 2) only two games supported it, 3) it was kind of expensive and the results, while great, weren't enough. None of my friends ever bought the card, so I never got to use it. It's still sitting in the original packaging. Looks like it'll never be used.

Quakken
05-18-2008, 08:32 PM
It's all about adoption with things like this. If they can get a large number of games on board at rollout, then it will be adopted by fans and games will have amazing physics; more companies will release bigger and better physics cards, and every gamer will benefit from more realistic, visually appealing physics in games.

They weren't, so they didn't, and the fans won't.

Sad.

crenn
06-13-2008, 09:05 AM
Furthermore, almost all of the games on show were demonstrating the use of PhysX physics acceleration using the new GPU from NVidia - we're in for another revolution, people!


...the Ageia team believes a PhysX driver for the new GeForce GPU will be finished and available between late June and early July. Whether this also includes the 8- and 9- series GPUs that are CUDA capable was not entirely clear, though I assume so - we'll have to wait and see.

Quotes are from a computer magazine I read.

The boy 4rm oz
06-13-2008, 09:24 AM
Hmmm.

crenn
06-13-2008, 09:28 AM
What's that "Hmmm." about?

EDIT:
Confirmed - http://www.vrforums.com/showthread.php?t=288295

The boy 4rm oz
06-13-2008, 11:00 AM
"we're in for another revolution people!"

Yes we are, but I can see it not being all it's cracked up to be. I personally think that it will only really benefit high end (GTX/ULTRA) systems or systems running SLI.

Tavarin
06-13-2008, 11:04 AM
My biggest question is whether separate physics cards will work with ATI GPUs. If nVidia makes this an nVidia-only compatible product I will be pissed.

The boy 4rm oz
06-13-2008, 11:11 AM
The other question is WILL it be a separate card dedicated to physics (like a standard PhysX card), or will you give up part of your current card/s to gain physics effects while most likely losing frame rates? CUDA is Nvidia technology, so I would say it will probably stay an Nvidia feature to start with. They will most likely sell licenses to AMD further down the line.

We just have to wait and see. I think it is a good idea in principle. The way Nvidia implement it may be its downfall, however. If the driver is actually released for the 88** series cards I will definitely try it.

nevermind1534
06-13-2008, 11:50 AM
I think they're going to cease production of separate cards. The drivers for the GF8 and up will support it with no additional hardware.

crenn
06-13-2008, 06:42 PM
From what I've heard, ATi haven't ruled out putting PhysX on their GPUs, but currently they're working with Havok (which is owned by Intel). PhysX is still 'open', so ATi could put it on their GPUs.
As far as I know, the GTX280 has a PhysX co-processor on the card....

The boy 4rm oz
06-13-2008, 11:37 PM
As far as I know, the GTX280 has a PhysX co-processor on the card....

Which should perform much better than dedicating some of the GPU clock to physics calculations.

crenn
06-20-2008, 05:28 AM
Early PhysX drivers are out in the wild:

http://forums.vr-zone.com/showthread.php?t=291319

The boy 4rm oz
06-20-2008, 05:35 AM
I will have to try it.

EDIT
*QUOTE*
The GeForce 177.39 driver for the 64bit version of Windows Vista is dated June 16, and brings PhysX support for the GeForce 9800 GTX, GeForce GTX 260 and 280 graphics cards and the nForce 750a SLI and 780a SLI IGPs.
*QUOTE*

Looks like I miss out, no 8800GTX support.

crenn
06-20-2008, 06:41 AM
That hasn't stopped others!

If you have the 177.35 drivers and the new PhysX drivers, you should be able to run it. All I'll say is this looks very promising!

The boy 4rm oz
06-20-2008, 07:50 AM
I may try it later. These are the drivers I am running according to Device Manager (yes, I haven't updated in a while): 7.15.11.6369. I will update and try later this weekend.

crenn
06-20-2008, 10:44 AM
It seems you need a dual card solution for this to work well. The performance hit with a single card solution (in games) is quite bad apparently. Anyone in Melbourne with a motherboard with dual PCIe 16x slots and an ATi card? I want to try something out...

The boy 4rm oz
06-21-2008, 03:26 AM
Just as I thought, the physics is too taxing on a single GPU. I still like the idea of a dedicated card; I may just get a PhysX card depending on how the on-GPU physics turns out.

crenn
06-21-2008, 03:29 AM
There have been mixed reports.... I guess we'll have to wait until TT or VRZone look into it.

The boy 4rm oz
06-21-2008, 03:43 AM
We'll just have to wait and see.

crenn
06-21-2008, 08:11 AM
All I know is, it seems possible to have an NVidia card running PhysX and an ATi card running graphics. Could this mean NVidia may release a card with no ports but with a GPU core?

Crazy Buddhist
06-21-2008, 09:15 AM
EDIT: After posting this I found this interview (http://www.firingsquad.com/features/ageia_physx_acquisition_interview/) with Derek Perez, NVIDIA's Head of Public Relations. The quotes below in red are things he says that support my earlier conclusions.

From Nvidia;

"Delivering physics in games is no easy task. It's an extremely compute-intensive environment based on a unique set of physics algorithms that require tremendous amounts of simultaneous mathematical and logical calculations.

This is where NVIDIA® PhysX™ Technology and GeForce® processors come in. NVIDIA PhysX is a powerful physics engine which enables real-time physics in leading edge PC and console games. PhysX software is widely adopted by over 150 games, is used by more than 10,000 registered users and is supported on Sony Playstation 3, Microsoft Xbox 360, Nintendo Wii and PC.

In addition, PhysX is designed specifically for hardware acceleration by powerful processors with hundreds of cores. Combined with the tremendous parallel processing capability of the GPU, PhysX will provide an exponential increase in physics processing power and will take gaming to a new level delivering rich, immersive physical gaming environments with features such as:

* Explosions that cause dust and collateral debris
* Characters with complex, jointed geometries for more life-like motion and interaction
* Spectacular new weapons with incredible effects
* Cloth that drapes and tears naturally
* Dense smoke & fog that billow around objects in motion

The only way to get real physics with the scale, sophistication, fidelity and level of interactivity that dramatically alters your entertainment experience will be with one of the millions of NVIDIA PhysX-ready GeForce processors.*

*Note: NVIDIA will deploy PhysX on CUDA-enabled GPUs later this year. The exact models and availability will be announced in the near future."


~~~~~~~~~~~~~~

Couple of points:

1. The move will be towards no separate PhysX cards, although they are currently still available - PhysX algorithms are being incorporated into the NVidia drivers. The GPU has plenty of stream processors that are perfectly suited to PhysX work - much more so than the CPU. With new driver releases your 8 or 9 series Nvidia GPU will become more and more PhysX-enabled in and of itself.


No separate PhysX cards:
"Physics is a natural for processing on the GPU"
FiringSquad: "Are there any plans to release new stand alone PhysX graphics chips or will AGEIA's technology be integrated into new NVIDIA GeForce graphics chips or nForce motherboards?"
Derek Perez: "We have no announcements at this time." <------- WOOOOO Look for the lie: "The only way to get real physics with the scale, sophistication, fidelity and level of interactivity that dramatically alters your entertainment experience will be with one of the millions of NVIDIA PhysX-ready GeForce processors." - from the above Nvidia release.

"we are going to enable GPU Physics as soon as possible."

"AGEIA designed its multithreaded PhysX software specifically for hardware acceleration in massively parallel environments. NVIDIA’s GPUs are well-suited to take advantage of PhysX software and AGEIA’s expertise".


2. A backwards rollout for 8 and 9 series GPUs is due this year. Exactly which models will be supported is yet to be confirmed, but it looks like perhaps all 8 and 9 series cards (i.e. all CUDA-capable cards, which implies they are porting the PhysX algorithms straight to CUDA and injecting them into their ForceWare drivers).

3. "The only way to get real physics with the scale, sophistication, fidelity and level of interactivity that dramatically alters your entertainment experience will be with one of the millions of NVIDIA PhysX-ready GeForce processors." I do not think Nvidia will be making it easy for ATI to use this technology. However it may well be possible to use an Nvidia card dedicated to the PhysX work and an ATI card for the actual rendering and graphics output.

4. As to releasing a card with no ports but a GPU, I think it is unlikely. If you think about it, since the day SLI was released there has been a VERY strong argument for such cards, which would offer the scaling of SLI at a lower cost, as the second card would have simpler production, a simpler layout, fewer materials etc. But it hasn't been done. I think Nvidia are moving away from separate PhysX cards with this move, and I would be surprised if the AGEIA cards do not come off the shelves relatively quickly. What would an Nvidia GPU with no ports and PhysX incorporated be? It would be a PhysX card, and I suspect these are on the way out.

5. I believe that with CUDA development, buying AGEIA and some of Nvidia's other recent moves, they are seeing a vast change in the nature of PC architecture on the horizon and a hard-fought period for the industry - and are therefore working towards positioning themselves to best take advantage of both the new developments and the inevitable industry shake-ups.

Vast change in the nature of PC architecture on the horizon:
"Second, the computer industry is moving towards a heterogeneous computing model, combining a flexible CPU and a massively parallel processor like the GPU to perform computationally intensive applications like real-time computer graphics. Physics is a natural for processing on the GPU because, like graphics, it is made up of thousands of parallel computations, and with our CUDA technology, which is rapidly becoming one of the most pervasive parallel computing programming environments in history, we can open this exciting parallel processing world to applications desperate for a giant step in computing performance—such as physics processing, computer vision, video/image processing, and a world of exciting applications we’ve not yet imagined."

When a university can build a CUDA-based supercomputer around 4 x 9800 GX2 graphics cards for a total cost of $4,000, one which in some tests outperforms the $13.5 million 1000-CPU cluster they built a couple of years ago, you KNOW there is major change in the air.

Crazy


}
:rant
print RandomStrings$
--$ IF brain == null THEN END ELSE GOTO rant {{$

EDIT2 HAHAHA Funny "Interview" (http://www.techarp.com/showarticle.aspx?artno=309&pgno=0)

"Tech ARP : But will this mean an increase in price for NVIDIA graphics cards that come with the PhysX processor? Or will NVIDIA continue to produce cheaper cards without the PhysX processor?

NVIDIA : Not at all. We not only intend to integrate the PhysX processor for free, we will now price our cards at least 20% cheaper than ATI's cards. If we have a vision for the next 5 years, it's to drive ATI into the ground, once and for all. When that happens, maybe we will buy those boys out and then we will be at the top of the telephone books too."

Anarchist
06-21-2008, 09:58 PM
NVIDIA : Not at all. We not only intend to integrate the PhysX processor for free, we will now price our cards at least 20% cheaper than ATI's cards. If we have a vision for the next 5 years, it's to drive ATI into the ground, once and for all. When that happens, maybe we will buy those boys out and then we will be at the top of the telephone books too."

That wasn't a real interview. ;)

Nvidia have no intention of buying ATI, as it would mean the following:
a.) AMD (which they would need to buy also) would probably lose their x86 license due to Nvidia not being based in the US (which is why AMD managed to get that license), which means their CPUs would be in no position to be used in Windows-based desktops.
b.) Nvidia has explicitly said before that they'd prefer to buy AMD as a processor firm, but wouldn't want the rest of its 'baggage' - e.g. ATI.
c.) They'd be fighting a war on three fronts rather than one if they did manage to keep the x86 license in their hands: chipsets, graphics and processors, which is part of the reason why AMD is in so much trouble.
d.) AMD is currently making a loss; the only part making a profit is ATI, so they couldn't afford to sell ATI on its own, as they'd soon become bankrupt (even with the money made from selling ATI, they'd probably not have enough money to guarantee the necessary research).
e.) Intel is making their own dedicated graphics solution anyway, so they'd be back to square one as soon as that was released (one company vs one company), and Intel has a much larger amount of money for research etc.
f.) AMD would probably have much less of a claim of 'monopoly' against Intel due to becoming part of a larger company; this would allow Intel to use more aggressive business tactics without getting into trouble.

Crazy Buddhist
06-21-2008, 11:55 PM
That wasn't a real interview. ;)

Blimey, you ARE clever. I didn't spot that. If I had, I would have said something like 'EDIT2 HAHAHA Funny "Interview" (http://www.techarp.com/showarticle.aspx?artno=309&pgno=0)', putting the word "interview" in quotation marks for emphasis. Call me an Englishman and rub my nose in a duck's bottom.


Nvidia have no intention of buying ATI, as it would mean the following:

I did not, anywhere, say, propose or imply they did.


a.) AMD (which they would need to buy also) would probably lose their x86 license due to Nvidia not being based in the US (which is why AMD managed to get that license), which means their CPUs would be in no position to be used in Windows-based desktops.
b.) Nvidia has explicitly said before that they'd prefer to buy AMD as a processor firm, but wouldn't want the rest of its 'baggage' - e.g. ATI.
c.) They'd be fighting a war on three fronts rather than one if they did manage to keep the x86 license in their hands: chipsets, graphics and processors, which is part of the reason why AMD is in so much trouble.
d.) AMD is currently making a loss; the only part making a profit is ATI, so they couldn't afford to sell ATI on its own, as they'd soon become bankrupt (even with the money made from selling ATI, they'd probably not have enough money to guarantee the necessary research).
e.) Intel is making their own dedicated graphics solution anyway, so they'd be back to square one as soon as that was released (one company vs one company), and Intel has a much larger amount of money for research etc.
f.) AMD would probably have much less of a claim of 'monopoly' against Intel due to becoming part of a larger company; this would allow Intel to use more aggressive business tactics without getting into trouble.


Blah blah blah.... a) is rubbish. b) see below. c) boy, you really missed the point of what I wrote. d) rubbish. e) yes, we know this. f) rubbish...

Nvidia won't buy AMD because they would also be buying the only other significant GPU business as part of the deal and would have a 90%+ monopoly in the graphics market through AMD's ownership of ATI. They would then be forced to sell the ATI business by both US and European monopoly authorities, and there is no obvious buyer.

In this other thread (http://www.thebestcasescenario.com/forum/showthread.php?t=15096) I predicted something about current trends in the market: that Nvidia will make more integrated "system on a die" computers, that that is where the industry as a whole is headed, and that the coming war in the industry will be Nvidia vs AMD and Intel.

Crazy.

Anarchist
06-22-2008, 12:46 AM
They would then be forced to sell the ATI business by both US and European monopoly authorities and there is no obvious buyer.

Completely wrong. :rolleyes:
The majority of the graphics market, even without their dedicated graphics solution having been released yet, is controlled by Intel, and courts in both the US and Europe would recognize this. As long as people were able to run desktop computers, with a graphics solution, capable of running modern operating systems, without having to use dedicated graphics, they would not risk monopoly claims - for example, you CAN play pretty much every game on Intel integrated graphics, it's just not that great an option. :rolleyes:

Look up the terms of the Intel/AMD x86 license agreement; you will find a) is right. The agreement is not valid overseas. For d) - "The chipmaker's net loss for the quarter was $358 million" http://www.ecommercetimes.com/story/62648.html The loss was greater in the last quarter of 2007.

For f), I believe you've just contradicted yourself. ;)

The boy 4rm oz
06-22-2008, 01:32 AM
One GIANT head f*** for me lol.

Crazy Buddhist
06-22-2008, 02:27 AM
Completely wrong. :rolleyes:

... they would not risk monopoly claim ...

Look up the terms of the Intel/AMD x86 license agreement, you will find a is right. The agreement is not valid overseas.

Nope, you are wrong, sorry - on both counts.

The problem with the licence agreement has nothing to do with "not valid overseas". All legal agreements and contracts, including licence agreements, are created/signed under a particular legal jurisdiction. "This agreement is made under the Laws of Delaware US" for example merely means that any problems arising under the contract will be dealt with under the laws and courts of Delaware.

If the license "is not valid overseas" AMD would only be able to sell their processors in the US. The licence clearly contains contractual arrangements that allow AMD to sell their x86 kit anywhere in the world.

(EDIT: from the licence: "...Agreement, Intel hereby grants to AMD a non-exclusive, non-transferable ***** worldwide license".)

However ... any contractual problem anywhere in the world under that licence will be dealt with in a Delaware court - or wherever they have specified. (EDIT: I guessed Delaware because fat corporates love the laws there .. guess what? AMD and Intel are both Delaware corporations haha)


There IS a licence problem if Nvidia were to buy AMD. It is a different one. It is believed the wide cross marketing and licencing agreements between Intel and AMD do not allow AMD to transfer intel technology to any third party. EDIT: further research reveals the real culprit to be one of the termination clauses in the contract between them:

(7) the other party undergoes a Change of Control. For purposes of this Section 6.2(b)(7), "Change of Control" shall mean a transaction or a series of related transactions in which (i) one or more related parties who did not previously own at least a fifty percent (50%) interest in a party to this Agreement obtain at least a fifty percent (50%) interest in such party, and, in the reasonable business judgment of the other party to this Agreement, such change in ownership will have a material effect on the other party's business, or (ii) a party acquires, by merger, acquisition of assets or otherwise, all or any portion of another legal entity such that either the assets or market value of such party after the close of such transaction are greater than one and one third (1 1/3) of the assets or market value of such party prior to such transaction.

From X-Bit Labs: (http://www.xbitlabs.com/news/cpu/display/20080214105804_Analyst_Expects_Nvidia_to_Acquire_AMD_Despite_of_Chances_to_Lose_x86_License.html)

"Perhaps, Jen-Hsun Huang, the chief executive of Nvidia, could re-architect Advanced Micro Devices in order to make it profitable. However, due to the fact that wide cross-licensing agreement between AMD and Intel (http://contracts.corporate.findlaw.com/agreements/amd/intel.license.2001.01.01.html), which is also believed to cover x86 instruction set, does not allow AMD to transfer any of Intel’s technologies to any third-party. As a result, if AMD is acquired by Nvidia, the new company will not have rights to produce x86 central processing units (CPUs) or utilize any technologies from Intel.

It is uncertain whether Nvidia, or any other company that has no wide cross-licensing agreement with Intel that covers x86 instruction set, is interested in getting AMD and not interested in making CPUs. But what is almost certain is that various antitrust organizations would be against the two suppliers of discrete GPUs becoming one."

EDIT PS: I used to be a merchant banker by trade and was often involved in drawing up contracts of this type. The publicly released version of this one is hilarious because AMD and Intel applied for so many passages to be hidden for confidentiality reasons. This is a lovely example:

(c) The parties represent, warrant and covenant that they shall not *****.

Crazy Buddhist
06-22-2008, 03:43 AM
4. As to releasing a card with no ports but a GPU, I think it is unlikely. If you think about it, since the day SLI was released there has been a VERY strong argument for such cards, which would offer the scaling of SLI at a lower cost, as the second card would have simpler production, a simpler layout, fewer materials etc. But it hasn't been done. I think Nvidia are moving away from separate PhysX cards with this move, and I would be surprised if the AGEIA cards do not come off the shelves relatively quickly. What would an Nvidia GPU with no ports and PhysX incorporated be? It would be a PhysX card, and I suspect these are on the way out.


How wrong. At the top end they ARE doing this, for research boffins and the like, only they call it a "Computing Processor" - elsewhere they slip in the "GPU" a la Freud and call it a "GPU Computing Processor"....

Tesla C1060 Computing Processor: (http://www.nvidia.com/object/tesla_c1060.html)


http://www.nvidia.com/docs/IO/55026/prod_shot_initial_tesla_c1060.jpg

The boy 4rm oz
06-22-2008, 04:18 AM
These have been on their site for a while now.

crenn
06-24-2008, 06:05 AM
It seems you need a dual card solution for this to work well. The performance hit with a single card solution (in games) is quite bad apparently. Anyone in Melbourne with a motherboard with dual PCIe 16x slots and an ATi card? I want to try something out...

Well, a lot of people aren't happy about Vantage getting a boost from an NVidia GPU running PhysX, and how that plays out remains to be seen. But the performance hit I heard about with a single card solution... well, things got interesting.

Someone with an 8800GT used a modified INF (so they could use the drivers with their card) and tried the PhysX levels in UT3. Here's what has stemmed from that.


Quite a few people, including myself, who have 512MB GTs or GTSs (G92) seem to get this weird FPS bug. Both levels take around 2-3 mins to load, then they start out all fine and dandy (30-60fps), then after a minute or so they nosedive to under 10fps, like someone switched off the PhysX support? :S

Just a quick follow up regarding PhysX in UT3 and the slowdowns...
http://www.theinquirer.net/gb/inquirer/news/2008/06/23/nvidia-cheats-3dmark-177
Copying the file mentioned and I don't get the stuck FPS anymore :thumbup:

They're referring to this:
http://images.vnu.net/gb/inquirer/news/2008/06/23/nvidia-cheats-3dmark-177/nvidia_ppu_ut3.jpg

This is certainly something to watch; it seems that even single card solutions will be able to do this well!

EDIT: Found from this - http://futuremark.yougamers.com/forum/showthread.php?t=83625

The boy 4rm oz
06-24-2008, 06:22 AM
Thanks for that crenn. I still haven't tried it with my 8800GTX, I will download the driver now.

crenn
06-24-2008, 06:24 AM
It doesn't work with anything but the GT200 and G92 cores currently. Remember, that driver isn't an actual BETA driver, but it's basically being treated as one. You'll have to wait, sorry.

The boy 4rm oz
06-24-2008, 06:42 AM
Still worth the download, you never know. Anyway, I can always roll back the driver if I get issues.

crenn
06-24-2008, 06:57 AM
Just beware, there is an issue in Crysis.

The boy 4rm oz
06-24-2008, 08:14 AM
Yeah, I heard about that. However, I don't have Crysis; I do have the demo for UT3.

crenn
06-24-2008, 08:38 AM
Even with the 'normal' drivers, you could run the UT3 demo fine. It's when you have the PhysX levels that it slows down a lot.

The boy 4rm oz
06-24-2008, 08:41 AM
I know I can run the demo fine, I got it when it was released. I just wanna see if there are any frame rate differences.

crenn
06-24-2008, 09:22 AM
Some people are reporting lower framerates; however, some are reporting lower temps as well.... I can't wait for PhysX to be officially released.

The boy 4rm oz
06-24-2008, 10:00 AM
Neither can I, now that I have seen it's not just a sales ploy. Good to see that some new games have signed up for the PhysX bandwagon also.

crenn
06-24-2008, 10:17 AM
Yeah, but Futuremark is going to have problems as Nvidia have 'cheated' by releasing GPU PhysX.

The boy 4rm oz
06-24-2008, 10:39 AM
They will probably release an update which will counteract the effect of the physics in tests, or make it less noticeable. They had the same issue with the quad core Intel processors; there was a huge increase in points when they were first released, until they patched it up.

crenn
06-24-2008, 10:54 AM
Futuremark has to do something to keep the masses happy.

crenn
06-25-2008, 08:01 PM
A little birdie told us that ATi might be getting PhysX as well.

http://www.overclock3d.net/news.php?/gpu_displays/ati_falls_for_physx/1

EDIT: Interesting read -
http://www.tgdaily.com/content/view/38121/128/

The boy 4rm oz
06-26-2008, 12:04 AM
I just thought of something. If you already have a PhysX card and you get a GTX280, would both cards calculate physics, or would it be dedicated to the PhysX card or the GPU? SLI PhysX lol.

You're right crenn, that was an interesting read.

crenn
06-26-2008, 12:16 AM
I think you select a card and let it handle it. However, I expect the high end models will process PhysX very quickly. The actual PhysX driver that enables GPU PhysX on some cards is finalised.

EDIT: Another interesting article:

http://www.hothardware.com/News/NVIDIA_Responds_To_GPU_PhysX_Cheating_Allegation/

It also shows that PhysX offloading for the CPU test can be disabled on the GPU and PPU, making it run on the CPU.

crenn
06-26-2008, 12:37 PM
This deserves its own post.

ATi cards are fully capable of running PhysX.... here's one someone prepared recently!
http://forums.vr-zone.com/showthread.php?t=293936

crenn
07-04-2008, 02:20 AM
NVidia supporting PhysX on ATi cards? You decide:
http://www.ngohq.com/news/14254-physx-gpu-acceleration-radeon-update.html

Things are getting interesting. If it's true, it's a very clever strategy from NVidia. Don't understand why? Think about it this way: the larger the hardware support for PhysX, the more developers will consider it for games, and that means NVidia can potentially sell more PhysX licenses.

The boy 4rm oz
07-04-2008, 04:38 AM
I never got around to trying those drivers. I downloaded them though. I am heading over to Melbourne next week; I may see if I can pick up a cheapo PhysX card while I am over there.

crenn
07-04-2008, 04:58 AM
Don't get a PhysX card, you can get an 8800GT cheaper xD

The boy 4rm oz
07-04-2008, 05:07 AM
I already have a GTX though lol. However, I did see an Ex-Demo 8800GTX for $299; I was thinking of getting it, but then I would need a new PSU lol. A 550W Antec Neo doesn't have the juice, or the connectors haha.

crenn
07-04-2008, 07:40 AM
You can use an 8800GT as a dedicated PhysX card and nothing else (you can select which GPU does what).

The boy 4rm oz
07-04-2008, 09:12 AM
HELL YEAH. Still don't have a spare connector on my PSU, nor the required wattage.

crenn
07-04-2008, 09:59 AM
Hehehe, time for an upgrade!

nevermind1534
07-04-2008, 11:58 AM
I messed up Direct3D by installing those leaked PhysX drivers for my 8800GT. I'd suggest just waiting until they're actually released by Nvidia. Also, when I reinstalled the Nvidia stuff I looked, and now I can't overclock or underclock my card, monitor the temps, or run a stress test. Does anybody know how to unlock this again?

The boy 4rm oz
07-04-2008, 12:36 PM
The CoolBits mod. It is a registry tweak which unlocks overclocking through the Nvidia Control Panel, as sketched below.
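
If you want to do it by hand, the version of the tweak I've seen posted around is a .reg file along these lines - double-check the DWORD value for your ForceWare version, since it has changed between releases (3 is the commonly quoted one for the clock controls), and back up your registry first:

Windows Registry Editor Version 5.00

; Unlocks the hidden clock frequency controls in the Nvidia Control Panel.
; The exact value depends on the driver version - 3 is the commonly
; posted one; other values enable other hidden options.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003

Merge it, reboot, and the extra clock tab should show up again.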

crenn
07-23-2008, 11:33 PM
For you NVidia/PhysX fans out there, put August 5th on your calendar!

http://forums.vr-zone.com/showthread.php?t=305153

The boy 4rm oz
07-24-2008, 02:35 AM
Oh, I just saw that over at Tech Power Up lol, you beat me crenn lol.

crenn
07-24-2008, 02:50 AM
Naturally, currently I don't have a life ^-^;

The boy 4rm oz
07-24-2008, 02:57 AM
HAHA poor you.