ViRGE/VX driver
You might have noticed that using one bit to tell the chip whether the data is a Z or a color value means that the color and depth precision is decreased to 15 bits per pixel. That is true. I am not sure about the Z-buffer, but colors look the same with and without MUX buffering. Given the heavy dithering in high-color modes on ViRGE, I assume that 15 bits per pixel are used for color data by default, thus each color component is stored in 5 bits.
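To make the idea concrete, here is a minimal sketch (my own illustration in C, not actual S3 code) of how spending one flag bit per 16-bit word leaves 15 bits of payload, i.e. RGB 5:5:5 for color or a 15-bit depth value:

    /* Illustrative only -- the flag position is an assumption. */
    #include <stdint.h>

    #define MUX_FLAG_Z 0x8000u  /* hypothetical: top bit marks a Z value */

    static uint16_t pack_color(uint8_t r, uint8_t g, uint8_t b)
    {
        /* Keep only the top 5 bits of each 8-bit component (RGB 5:5:5). */
        return (uint16_t)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
    }

    static uint16_t pack_z(uint16_t z16)
    {
        /* The depth value loses its lowest bit: 15 bits remain. */
        return (uint16_t)(MUX_FLAG_Z | (z16 >> 1));
    }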

I am among those who somehow like the high-color dithering on ViRGE way more than the dithering on ATI cards, and it is a part of my '90s gaming experience.

The low internal color precision leads to interesting but annoying alpha-blending artifacts, where alpha-masked polygons show dithering even on pixels that should be fully transparent. The tire smoke behind the vehicles in Rollcage shows the ugliness of the ordered-grid dithering artifacts when too many alpha-blended polygons are drawn over each other. Using true-color modes allows you to avoid the typical dithering artifacts and enjoy better picture quality.
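For illustration, ordered-grid dithering of this kind boils down to a fixed, position-dependent threshold matrix; the sketch below uses a generic 4x4 Bayer matrix (whether ViRGE uses exactly this pattern is not something I am claiming). Because the threshold depends only on screen position, every blended pass re-applies the same grid, which is why the pattern accumulates:

    #include <stdint.h>

    /* Classic 4x4 Bayer threshold matrix (values 0..15). */
    static const uint8_t bayer4[4][4] = {
        {  0,  8,  2, 10 },
        { 12,  4, 14,  6 },
        {  3, 11,  1,  9 },
        { 15,  7, 13,  5 },
    };

    /* Quantize an 8-bit channel down to 5 bits with ordered dithering. */
    static uint8_t dither_to_5bit(uint8_t value, int x, int y)
    {
        /* Going from 8 to 5 bits loses steps of 8; the 0..15 thresholds
         * are halved so they span exactly one lost step. */
        int v = value + (bayer4[y & 3][x & 3] >> 1);
        if (v > 255)
            v = 255;
        return (uint8_t)(v >> 3);  /* result in 0..31 */
    }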

This sounds strange at first, but it makes sense. ViRGE is very inefficient in its memory accesses during texturing, as it needs to jump around in memory a lot and it always reads just small bits of data. The memory chips spend most of the time accessing the first word from a new location, while all subsequent words are typically read at the speed of one word per chip cycle.
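A back-of-the-envelope model makes the point; the cycle counts below are assumptions chosen for illustration, not measured ViRGE figures:

    /* Cost model: each access pays a fixed setup cost for the first word
     * of a new location, then streams one word per cycle. */
    #define FIRST_WORD_CYCLES 6   /* assumed setup cost, not a real figure */
    #define NEXT_WORD_CYCLES  1

    static long read_cost(long accesses, long words_per_access)
    {
        return accesses *
               (FIRST_WORD_CYCLES + (words_per_access - 1) * NEXT_WORD_CYCLES);
    }

    /* Fetching 1024 words one by one:  read_cost(1024, 1) == 6144 cycles.
     * One linear burst of 1024 words:  read_cost(1, 1024) == 1029 cycles. */

Under this model, scattered single-word texel fetches waste most of the available cycles on setup, while linear operations such as fills stream at close to full speed.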

Also, the required bandwidth is not doubled, as only the color buffer is larger; with a 16-bit Z-buffer, only the color traffic doubles while the depth traffic stays the same, so the total grows by noticeably less than 2x.

There is in fact one operating system that has support for OpenGL hardware acceleration on any ViRGE card, and the driver is even bundled with the operating system. The DDK contains the whole source code of the driver excluding the miniport part, so any hardware vendor can see how the features are implemented.

You can modify the driver and compile it by yourself; the build starts from the s3mvirge sources in the DDK. There are also a few other caveats before you get a fully working driver in a distributable form, for example the missing installation files. The easier way is to combine the new DLL file with any existing ViRGE driver.

Only the polygons with certain unsupported blending features are rendered using the CPU and then sent to the framebuffer memory; these polygons are drawn slower, but without any graphics glitches.

The driver uses OpenGL hardware acceleration only if the desktop color depth is set to 16-bit high-color. I found only one issue: the driver switches to software-only mode after any color depth or resolution change, and you need to restart the system to get the hardware acceleration back.

I still remember all those online discussions with people looking for an affordable OpenGL accelerator.

At the time, you could buy a Matrox Millennium 1 card that supported an OpenGL MCD but could not accelerate textures or transparency effects. Its dithering artifacts on alpha-blended polygons were worse than what ViRGE produced, and its bilinear texture filtering was one of the worst you could see. ViRGE was never a good choice for GLQuake, but it could have been a good cheap speed-up for 3D modeling programs in the early days of 3D acceleration.

Simple smooth-shaded triangles are drawn at a speed similar to alpha-blended ones, so the speed is effectively halved. I assume that the MCD driver does not have very good memory management and large textures are sometimes drawn using the CPU. The high driver overhead means the system handles only about 30 thousand draw calls with the MCD, far fewer than with an ICD. I assume that Microsoft did not spend much time optimizing the driver and preferred it to be easy to read for other programmers.
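A draw-call figure like that would come from a micro-benchmark along these lines (an illustrative sketch only; window and context setup are omitted and the tiny triangle is a placeholder):

    #include <GL/gl.h>
    #include <time.h>

    /* Issue many minimal draw calls and time them; with geometry this
     * small, the result is dominated by per-call driver overhead. */
    static double calls_per_second(long calls)
    {
        clock_t start = clock();
        for (long i = 0; i < calls; ++i) {
            glBegin(GL_TRIANGLES);
            glVertex2f(0.00f, 0.00f);
            glVertex2f(0.01f, 0.00f);
            glVertex2f(0.00f, 0.01f);
            glEnd();
        }
        glFinish();  /* make sure the driver really processed the calls */
        return calls / ((double)(clock() - start) / CLOCKS_PER_SEC);
    }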

Very impressive article! Well done! Have you looked at the S3 Trio 3D and 2X cards as well? To the best of my understanding, they are also based on the ViRGE but offer some real improvements beyond being AGP; there are two distinct variants, as I remember. Anyway, thank you for a very comprehensive look at these mostly ill-reported cards.

It has a fantastically clean output and seems to be one of the best of the first ViRGE cards available.

The output quality is superb on these. I had troubles only with my GX2, but I believe that it just needed its capacitors replaced, plus the trick of changing the black level voltage using its registers. I even have one and tested it in multiple games. From my point of view, all the ViRGE cards had some real advantages when released; even the GX2 was usable for low-end 3D gaming and was pretty good at multimedia.

On the other side, the Trio3D was all about the low price.

Amazing article. It is good to see articles like these. Anyway, you forget one thing: the bad manufacturers. They did so much damage to S3. I think they were responsible for the bad fame of those cards.

You are right. Bad manufacturers caused a lot of harm, and not only to S3. Thankfully, ViRGE required a 64-bit data bus to work. This explains why the experience with the low-end cards varied so much among users. They were called decelerators mostly because they were bundled with computers for too long.

ViRGE was showing its age in late '97, but budget computer integrators would still sell it as the low-cost option in '98; I just found a budget computer test in a Polish magazine from April of that year, and there were still configurations with a ViRGE. It was already obsolete by then, as the new 3D accelerators that had arrived in the meantime were all much better. ViRGE was on the market for a very long time. On the other side, 3D aside, it was a good chip.

I think people like my father never used any 3D feature of their computer until Windows Vista found its way onto it and brought a 3D-accelerated desktop compositor.

I love everybody who cares about this topic, everybody who makes the effort to clear the reputation of these cards.

I always consider what it means to own a 3dfx card nowadays versus at the time when the card was released.

Tip: For Windows 98, use the S3 drivers rather than the default ones that come with Windows.

However, S3 made significant updates to the 3D engine: perspective correction is separated out so it no longer causes unnecessary clocks in the pipeline, and they implemented a new texture filter which was able to sample from different mip-maps.
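For context, a generic perspective-correct span loop (a textbook sketch, not S3's actual pipeline; sample_texture is a hypothetical stand-in) shows where the expensive per-pixel work sits, and why moving the divide out of the main pipeline saves clocks:

    extern void sample_texture(float u, float v);  /* hypothetical texel fetch */

    typedef struct {
        float u_over_w, v_over_w, one_over_w;
    } Span;

    /* Walk one scanline span with perspective-correct interpolation:
     * u/w, v/w and 1/w interpolate linearly in screen space, but each
     * pixel needs a divide to recover the true texture coordinates. */
    static void shade_span(Span s, Span step, int count)
    {
        for (int x = 0; x < count; ++x) {
            float w = 1.0f / s.one_over_w;  /* the per-pixel divide */
            sample_texture(s.u_over_w * w, s.v_over_w * w);
            s.u_over_w   += step.u_over_w;
            s.v_over_w   += step.v_over_w;
            s.one_over_w += step.one_over_w;
        }
    }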

Despite being their flagship product, this card was a commercial failure because of poor software drivers and poor performance compared to the competition.

This one was aimed at the mobile laptop market, although some MX chips found their way onto other vendors' cards, presumably because the MX was a cheaper variant of the GX2 thanks to its lower clock speed.

Unsure if S3 released a card of their own with this, but the Trio 32 chipset can be found on the Diamond Stealth SE graphics card. Chipset: 86C732. Memory: ?

The Savage4 LT was the "Lite" variant, and there was also an "Extreme" with a higher core clock. The Savage4 can be thought of as a bug-fixed Savage3D: good specs on paper, but it turned out to be a commercial failure.

It was considered Diamond's budget offering at its launch in late 1999, after S3's takeover of Diamond. The Savage3D had been launched in 1998.

If the PMI file has not been created, the adapter will not be supported. This process can take several seconds and will black out the screen. When the installation is complete, shut down your system, remove the diskette, and restart your computer. When the system has restarted:

a. Open the System Setup folder.
b. Open the System object.
c. If your monitor has not been detected as DDC compatible, select your display from the display list on page 2 of the Screen tab. If your display does not appear in the list, select Default.
d. Restart your computer to ensure all refresh rate options are available.

e. Open the System object and select page 1 of the Screen tab, as in step b.
f. Select the desired screen resolution and a screen refresh rate.
g. Close the Settings notebook or System Properties.

To return to the default refresh rate, delete the CFG file and reboot. A new refresh rate can then be selected using the procedure described above. The contents of these files might be useful if you need to report an installation problem to IBM. Not re-installing the driver might result in serious system problems.

For additional information, see section 9.

NOTE: The following information is meant as a guide; your LCU command file might be different. The example assumes a CFG file with 75Hz as the default at the given resolution. The existing CFG file should be deleted first.

Select the radio button next to the desired font size. A dialog box appears with the message that the new settings will take effect the next time you restart your Windows session.

Current open sessions will not be affected by the new settings. Click OK. Selected settings remain in effect until explicitly changed, or until the display resolution or driver is changed.

A CID install can fail because of an EXE loading in the background; ensure that no other processes are running during the CID install. The workaround is as follows:

a. Create a master CFG file using a system with the same graphics card and monitor, configured with the S3 driver, the correct display type, and the desired resolution and refresh rate.

The client system will then be configured with the correct driver, display type, resolution, and refresh rate.

A context-sensitive menu will appear. If not, select the desired radio button. Therefore, it is recommended that you reset the altered settings to their defaults first. A context-sensitive menu is displayed. A DOS Window or Full Screen message is displayed, stating that this session may have an active program and asking if it should be closed without saving your data.

End of Document.

Important:
o "S3 DRV1" must be the label on the installation diskette in this package.
o The 32 bpp device driver might need a large swap space on the hard disk, for example, 15 MB.

This device driver automatically detects DDC monitor capabilities and sets the display to the maximum refresh rate supported by the monitor.

Use the following procedures when installing a DDC monitor. Reset the display to the lowest refresh rate available. The device driver diskette must be labeled "S3 DRV1." It may also be used prior to installation to verify the current driver version; however, some previous device drivers did not provide this information.
