
  • Brick88 - Thursday, August 30, 2007 - link

    Doesn't anyone feel that AMD is cutting itself short? Yes, Intel is their primary competitor, but by not producing an IGP chipset for Intel-based processors, they are cutting themselves out of a big market. Intel ships the majority of processors, and AMD will need every single stream of revenue to compete with Intel.
  • bunga28 - Wednesday, August 29, 2007 - link

    Charles Dickens would roll over in his grave if he saw you comparing these two boards by paraphrasing his work.
  • Myrandex - Tuesday, August 28, 2007 - link

    I don't know why they would ever put that name on the board. The fact that it is getting beaten by an ASRock motherboard in gaming performance is pathetic, since that name is supposed to be all about gaming (no offense to the ASRockers out there; they aren't bad boards, and I have more experience with them than with Fatal1ty's anyway).
  • Etern205 - Tuesday, August 28, 2007 - link

    On the "abit Fatal1ty F-I90HD: Feature Set" page,
    that abit EQ software interface of a car looks
    familiar, like one of those real models.

    Like this one
    http://img404.imageshack.us/img404/8490/toyotafjhh...

    source:
    http://www.automobilemag.com/new_car_previews/2006...
  • strikeback03 - Tuesday, August 28, 2007 - link

    I was thinking Hummer, either way...
  • Etern205 - Tuesday, August 28, 2007 - link

    Not really, because the face of a Hummer is different
    from the Toyota's. The face of a Hummer has
    vertical grille bars, while the Toyota's does not.
  • strikeback03 - Wednesday, August 29, 2007 - link

    However, the Hummer has a full-width chrome fascia, while the Toyota has a part-width, sort-of-satin chrome thing.

    I highly doubt they licensed an image of either, so it can't look exactly like any vehicle. I remember a lawsuit between Jeep and Hummer over the seven vertical slots in each other's grilles several years ago.
  • eBauer - Tuesday, August 28, 2007 - link

    Why are the Xpress 1250 systems running tighter timings (4-4-4-12) while the G33 system is running looser timings (5-5-5-15)?
  • strikeback03 - Tuesday, August 28, 2007 - link

    quote:

    All of our boards were able to run 4GB of OCZ HPC Reaper at DDR2-800 speeds on 2.04V or less. Our optimal timings for the two X1250 boards were 4-4-4-12 while we had to run at 5-5-5-15 on the MSI G33M board. The MSI board did not care for CAS4 settings with 4GB installed but the overall memory results are still very competitive. In fact, the Sandra unbuffered scores are around 12% better than our X1250 boards and in a couple of our application benchmarks that rely on memory throughput and low latencies, this advantage will be apparent.


    Top of page 8
  • Mazen - Tuesday, August 28, 2007 - link

    I have a 6000+ (gift) and I am just wondering whether I should go with a 690G or wait for nvidia's upcoming MCP 78. Can't wait for the 690G review... thoughts anyone?
  • Sargo - Tuesday, August 28, 2007 - link

    Nice review, but there's no X3100 on Intel G33. GMA 3100 (http://en.wikipedia.org/wiki/Intel_GMA#GMA_3100) is based on a much older architecture, so even the new drivers won't help that much.
  • ltcommanderdata - Tuesday, August 28, 2007 - link

    Exactly. The G33 was never intended to replace the G965 chipset; it replaces the 945G chipset and the GMA 950. The G33's IGP is not the GMA X3100 but the GMA 3100 (no "X"), and that IGP is virtually identical to the GMA 950 but with higher clock speeds and better video support. The GMA 950, GMA 3000, and GMA 3100 all have only SM2.0 pixel shaders, with no vertex shaders and no hardware T&L engine. The G965 and the GMA X3000 remain the top Intel IGP until the launch of the G35 and GMA X3500. I can't believe AnandTech made such an obvious mistake, but I have to admit Intel isn't helping matters with their ever-expanding portfolio of IGPs.

    Here's Intel's nice PR chart explaining the different IGPs:

    http://download.intel.com/products/graphics/intel_...

    Could you please run a review of the G965 chipset and the GMA X3100 using XP and the latest 14.31 drivers? They are now out of beta, and Intel claims full DX9.0c SM3.0 hardware acceleration. I would love to see the GMA X3000 compared with the common GMA 950 (also supported in the 14.31 drivers, although it has no VS to activate), the Xpress X1250, the GeForce 6150 or 7050, and some low-end GPUs like the X1300 or HD 2400. A comparison between the 14.31 drivers and the previous 14.29 drivers, which had no hardware support, would also show how much things have improved.
  • JarredWalton - Tuesday, August 28, 2007 - link

    I did look at gaming performance with a 965GM chipset in the PC Club ENP660 review (http://www.anandtech.com/mobile/showdoc.aspx?i=306...). However, that was tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    quote:

    I did look at gaming performance under Vista with a 965GM chipset in the PC Club ENP660 review. However, that was also tested under Vista. I would assume that with drivers working in all respects, GMA X3100 performance would probably come close to that of the Radeon Xpress 1250, but when will the drivers truly reach that point? In the end, IGP is still only sufficient for playing with all the details turned down at 1280x800 or lower resolutions, at least in recent titles. Often it can't even do that, and 800x600 might be required. Want to play games at all? Just spend the $120 on something like an 8600 GT.


    It has the drivers for XP.
  • JarredWalton - Wednesday, August 29, 2007 - link

    Unless the XP drivers are somehow 100% faster (or more) than the last Vista drivers I tried, it still doesn't matter. Minimum details in Battlefield 2 at 800x600 got around 20 FPS. It was sort of playable, but nothing to write home about. Half-Life 2 engine stuff is still totally messed up on the chipset; it runs DX9 mode, but it gets <10 FPS regardless of resolution.
  • IntelUser2000 - Wednesday, August 29, 2007 - link

    I get 35-45 fps in the single-player demo for the first 5 minutes at 800x600 minimum. Didn't check more, as it's limited.

    E6600
    DG965WH
    14.31 production driver
    2x1GB DDR2-800
    WD360GD Raptor 36GB
    WinXP SP2
  • IntelUser2000 - Tuesday, September 11, 2007 - link

    Jarred, PLEASE PROVIDE THE DETAILS OF THE BENCHMARK/SETTINGS/PATCHES used for BF2 so I can run equal testing to what you did in the Pt. 1 article.

    Like:
    -What version of BF2 was used
    -Which demos are supposed to be used
    -How do I load the demos
    -etc
  • R101 - Tuesday, August 28, 2007 - link

    Just for the fun of it, for us to see what the X3100 can do with these new betas. I've been looking for that test since those drivers came out, and still nothing.

  • erwos - Tuesday, August 28, 2007 - link

    I'm looking forward to seeing the benchmarks on the G35 motherboards (which I'm sure won't be in this series). The X3500 really does seem to have a promising feature set, at least on paper.
  • Lonyo - Tuesday, August 28, 2007 - link

    quote:

    This is not to say any of the AMD and NVIDIA IGP solutions are that much better; they are in many ways, but without earnest competition from Intel these solutions do just enough to stay ahead of Intel. However, at least these solutions provide a much higher degree of compatibility and performance with most games, video playback, and applications. While running the latest games such as Bioshock or Supreme Commander will require a resolution of 800x600 with medium-low quality settings, at least a user has the chance to play the game until they can afford a better performing video solution.


    quote:

    the R4x0 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design for the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which current Intel hardware claims to be capable of in XP with the latest drivers), developers writing DX9 apps will still be designing for the SM2.0 target which the X1250 meets.



    Bioshock requires SM3.0.
  • Griswold - Wednesday, August 29, 2007 - link

    There are a couple of SM2.0 patch projects for Bioshock out there. Google for it.
  • mostlyprudent - Tuesday, August 28, 2007 - link

    I am looking forward to the rest of the series.
