Inno3D Tornado GeForce4 MX440

by Len "Viking1" Hjalmarson

Article Type: Review
Article Date: March 01, 2002

Product Info

Product Name: Inno3D Tornado GeForce4 MX440
Category: Video Card
Manufacturer: InnoVision
Release Date: Released

* * *




NVIDIA's Solutions

Two years ago NVIDIA introduced their GeForce line of video accelerators with integrated transformation and lighting (T&L).

The world had not seen a 3D accelerator with hardware T&L functions since the days of Rendition. But this was not those days, and the new hardware was many times more powerful than Rendition's chip. NVIDIA rightly named their new hardware a GPU (graphics processing unit), since it rivaled the CPU in power and was doing an equal amount of work. The gaming world has not been the same since.

The first of the GeForce generation boards sported either 16 or 32MB of main memory. GeForce2 GTS arrived not long after, and improved performance on the same system by at least 33 percent. The GeForce2 series topped out with the GeForce2 Ultra, with 64MB of main memory and running around 5000 3DMarks (2001) on a 1 GHz class CPU.

GeForce3 arrived in 2001 with a great deal of fanfare. As the marketing wheels spun around, virtual pilots were asking tough questions. What use are all these futuristic features when my current games and even ones in late stages of development won’t make use of them? Why should I pay the price premium instead of buying into the last generation?

The answer, of course, was buying for the future. If you are in a position to upgrade and you don’t want to upgrade again next year, it makes sense to invest in technology that will carry you further. (As if any flight simulation fan kept his graphics card for more than twelve months anyway!)

The programmable pixel shaders and vertex shaders of GeForce3 and DirectX 8 fame do indeed offer developers greater flexibility than they have ever had in the past. Furthermore, antialiasing became a standard rather than an extra.

Unfortunately, the GeForce3 generation was not quite up to the challenge of antialiasing while supplying high resolutions and 32-bit color on our modern flight simulations. Not to worry, NVIDIA had the solution in mind all along.



GeForce4



The Inno3D Tornado GF4 MX440

On February 6th in San Francisco NVIDIA launched its latest technology in the form of the GeForce4 line of accelerators. Arriving in at least four major forms, the GeForce4 will appear as the flagship Ti4600 with 128MB, the mid-range Ti4400 with 128MB, the entry-level Ti4200, and the GeForce4 MX in several variants (mostly built around slower memory).

Clock speeds are up from the GeForce3 Ti line, with the flagship Ti4600 set at 300 MHz compared to the 240 MHz of the GeForce3 Ti500. Memory speed has increased from 250 MHz to 325 MHz, for an effective DDR rate of 650 MHz. Whew, I wish my motherboard memory was running in that neighborhood.

Meanwhile, memory bandwidth is up more than 25 percent. It’s interesting that in-game tests place the average frame rate gain in the same ballpark. Memory bandwidth is critical to every improvement in performance; in previous generations of 3D hardware, the bottleneck was routinely the memory bus.
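The bandwidth figures are easy to sanity-check. Here is a back-of-the-envelope sketch in Python, assuming the published 128-bit memory bus on both boards and the 250 MHz (Ti500) and 325 MHz (Ti4600) memory clocks:

```python
def ddr_effective_mhz(memory_clock_mhz):
    # DDR transfers data on both clock edges, doubling the effective rate
    return memory_clock_mhz * 2

def peak_bandwidth_gbs(bus_width_bits, effective_mhz):
    # bytes per transfer x million transfers per second, reported in GB/s
    return (bus_width_bits / 8) * effective_mhz / 1000

ti500 = peak_bandwidth_gbs(128, ddr_effective_mhz(250))   # GeForce3 Ti500
ti4600 = peak_bandwidth_gbs(128, ddr_effective_mhz(325))  # GeForce4 Ti4600
gain = (ti4600 - ti500) / ti500 * 100
print(f"Ti500: {ti500:.1f} GB/s, Ti4600: {ti4600:.1f} GB/s, gain: {gain:.0f}%")
```

That works out to 8.0 GB/s versus 10.4 GB/s, a gain of 30 percent, which squares with the "more than 25 percent" figure.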

The flagship Ti4600 doubles the raw memory of the GeForce3 to 128MB. That is one whopping amount of video memory, but with memory prices as low as they have been this past year it’s no surprise.

Other physical changes in the GeForce4 chip compared to the GeForce3 line are in memory technology, antialiasing, and the addition of a second vertex shader unit. The additional vertex shader won’t benefit a simulation like IL-2 Sturmovik much, but two units will improve performance in future games that use even more complex lighting and alpha effects (like smoke, clouds and transparency). As new simulations make greater use of DX 8 calls, we’ll see an additional performance benefit with the GeForce4.

A Bf 109 fights for its life in 'IL-2 Sturmovik'

The GeForce4 is more than 25 percent more efficient at ignoring textures that will be masked (occluded) in the final scene. I can’t help but wonder if NVIDIA wasn’t impressed by the KYRO II’s use of occlusion technology. It’s natural to see improvement in this area: why spend energy producing pixels that won’t appear in the final scene? Any improvement in efficiency translates powerfully into frame rate at this level.

“Work smarter, not harder.” Improvements in efficiency are also met by improved caching technology.

Antialiasing has never quite lived up to its promise with NVIDIA. Where 3dfx seemed to get it right from the start, it’s been a struggle with NVIDIA hardware. Typically the performance penalty has been 25-30 percent, but that was only with very basic antialiasing. 4X antialiasing, where scenes start to really look smooth, has incurred a penalty of up to 40 percent, simply unacceptable in most simulations. With GeForce4 there is some light at the end of the tunnel.

In fact, Quincunx antialiasing, which is equivalent to 4X, can now be performed at the same speed as 2X. This is good news. Accuview is the name for the new antialiasing option. Accuview uses more texture samples per pixel for greater accuracy.

One of my complaints about the GeForce3, in fact, concerned exactly this point. When I first enabled antialiasing in IL-2 Sturmovik I was shocked to see the blurry textures on the ground. To recover texture accuracy I had to enable anisotropic filtering in IL-2 Sturmovik's setup, but the performance penalty was nasty. What to do? I settled for no antialiasing and a higher resolution.

With GeForce4, antialiasing as a whole has been streamlined and is much more efficient. The difference is so substantial that I think it is safe to say that only with GeForce4 does antialiasing become a standard for NVIDIA.



Inno3D Tornado GeForce 4 MX



Properties tab in WIN ME

By now most COMBATSIM.COM readers know that the GeForce4 MX is a category all its own, neither fish nor fowl. The chip is not a GeForce4 per se, and neither is it a GeForce3 per se. So what is it then?

GeForce4 MX sports the same core as the GeForce4, and even the same advanced memory technology, but it lacks the programmable pixel shaders. Then why call it GeForce4 at all? Isn’t this misleading?

While some will argue this is a marketing decision, the core of the GeForce4 MX is indeed the same as the Ti4600 with the exception of the missing pixel shaders and a limited vertex shader. It does share in much of the evolved technology, and so…GeForce4 MX.

The GeForce4 MX line will consist of at least three variants: the MX 460, the MX 440 and the MX 420. All three boards will feature a 128-bit memory bus with a 64MB frame buffer, but only the 460 and 440 will use DDR SDRAM. The MX 420 features conventional SDR SDRAM.

The GeForce4 MX 460 will be priced around $179 and will sport a 300MHz GPU clock and a 275MHz memory clock. The MX 440 is clocked internally at 270MHz with a 200MHz memory clock, priced around $149. The entry-level GeForce4 MX 420 will be clocked at 250MHz core with 166MHz memory and should retail around $99.

Inno3D’s Tornado GeForce4 MX board looks like the reference design. It is packed with 64MB of memory, in the form of four 16MB Samsung memory chips.

The board sports a standard VGA connection and an S-Video connection, but the DVI connector is relegated to the MX 460. If you want to connect a LCD monitor you’ll have to purchase the more expensive version.

For the business user, the drivers allow access to the new nView features. You can set up multiple displays, but you can also configure a variety of profiles for various video modes. You can also configure hot keys to access modes on the fly, or alter the opacity of any display window you set up.

Installation was easy. One of the nice things about moving from one NVIDIA board to another is that you may not even have to install new drivers. Moving from GeForce2 MX on my backup system to the Inno3D Tornado involved swapping out the board and rebooting. That was it!

I didn’t need the included driver CD, but it comes in a nice triple package: Inno3D’s MegaPack bundle consists of the installation CD, WinDVD 2000 (version 3.1) and PowerDirector SE for video editing.

While I didn’t test PowerDirector, since I don’t own a video camcorder, I did install and run WinDVD. This version didn’t give me any of the pauses of the earlier version and I was able to enjoy Pearl Harbor, the movie, to the max. In fact, there is a nifty new feature in WinDVD that allows selection of a portion of the 16:9 image, effectively cropping the display edges to nearly fill the display.



Tornado GeForce4 MX440 Performance

Performance wise, the GeForce4 MX is well beyond the GeForce2 MX. Video hardware based on this chip will run current games very well, while future games that take advantage of DirectX 8 features will suffer. For the budget minded the GeForce4 MX series will be attractive, and we will undoubtedly see the chip appearing as an integrated part on many mainboards.

How does the GeForce4 MX compare to the GeForce2 MX in current games? The performance improvement averages from 20 to 40 percent. In IL-2 Sturmovik, with graphics options on HIGH (as opposed to EXCELLENT) on a 1 GHz Duron system with 512MB DDRAM, the GeForce4 MX scored 22 FPS in FRAPS as opposed to 17 FPS for the GeForce2 MX (1024x768 at 32-bit color).
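Frame rate gains like these are easiest to read as percentages. A quick sketch, using the FRAPS numbers above:

```python
def percent_gain(new_fps, old_fps):
    # relative frame rate improvement, in percent
    return (new_fps - old_fps) / old_fps * 100

gain = percent_gain(22, 17)
print(f"GeForce4 MX over GeForce2 MX in IL-2: {gain:.0f}%")
```

At roughly 29 percent, the IL-2 result lands squarely in the 20 to 40 percent range.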

Of course, frame rate doesn’t tell the whole story. Image quality is slightly improved, and antialiasing becomes possible on the GeForce4 MX, where it is simply not possible on GeForce2 MX. Furthermore, there were noticeable pauses on the GeForce2 MX, where they simply vanished on the GeForce4 MX. I’m not certain whether this is due to the larger memory allocation, but I suspect it is a combination of increased memory bandwidth and improved texture processing.

The driver version and all IL-2 Sturmovik settings were identical for all tests.

3DMark Chart

3DMark 2001 scores were almost predictable. The GeForce4 MX improves over GeForce2 MX by roughly 66 percent.

GeForce4 MX performance is roughly equivalent to the GeForce2 GTS. Not bad, but then the GeForce4 MX costs about the same, and the GeForce2 is no longer in production anyway.

One bright spot is that the MX 440 has some headroom. I was able to overclock the main memory by a whopping 15 percent without loss of stability. With the DDR memory at 230 MHz the effective rate is 460 MHz, and actual performance in 3DMark 2001 increased to 3550.
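A quick check of the overclock arithmetic, assuming the MX 440's stock 200 MHz memory clock from the spec rundown above:

```python
stock_mem_mhz = 200                  # MX 440 stock memory clock
oc_mem_mhz = stock_mem_mhz * 1.15    # 15 percent overclock
effective_mhz = oc_mem_mhz * 2       # DDR doubles the effective rate
print(f"{oc_mem_mhz:.0f} MHz actual, {effective_mhz:.0f} MHz effective")
```

That gives 230 MHz on the memory clock and a 460 MHz effective DDR rate.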

Where the GeForce2 MX is memory and fill rate limited, the GeForce4 MX is CPU limited and scales much better. On a 1.6 GHz AMD Athlon system the 3DMark score hit 5800. Furthermore, the score remains the same whether the test runs in 16-bit or 32-bit color. The GeForce2 MX simply couldn’t match that.

3DMark Data

As for compatibility, anything you can run on GeForce2 or GeForce3 will run on GeForce4. I tried quite a slew of games and had no problems with any of them, using the provided 27.40 drivers.

I tested these games:
  • Combat Flight Simulator II
  • Destroyer Command
  • European Air War
  • Falcon 4
  • Flanker 2.5
  • Flight Simulator 2002
  • IL-2 Sturmovik
  • Silent Hunter II


The Purchase Dilemma

It’s not hard to recommend the Tornado GeForce4 MX440 to gamers on a budget. The board performs well and even has some overclocking headroom. What is tougher to estimate is the potential impact on developers.

The GeForce4 MX does not support the advanced features of DX 8. This means that if you purchase the Tornado GeForce4 MX440, you are not encouraging developers who are considering investing time and money in supporting more advanced processing features.

At the same time, for $149 you may be able to pick up an ATI board that does support the programmable features of DX 8. If you have the choice between the Radeon 8500 LE (with 128MB) and the GeForce4 MX at a similar price, your selection becomes an easy matter.

There will be a third option, however, in the entry level GeForce4 market. The GeForce4 Ti4200 is likely to retail around $199. If you aren’t cash strapped, this would be the best purchase you could make. But for the budget upgrade, GeForce4 MX can carry you until you can afford the more pricey products.






© 2014 COMBATSIM.COM - All Rights Reserved