
Fubar512
MODERATOR
Posts: 8,418 | Days Won: 42

Everything posted by Fubar512

  1. Read here, Misty FAC: http://www.simhq.com/simhq3/sims/boards/bb...7209;p=2#000063
  2. Update: I treated the Old Dog to another video card upgrade, a GeForce 6600 GT AGP, this past Monday, ran several benchmarks, and then tried WoV for comparison. The average frame rate went up by 2-3 FPS, and the low frame rate went up fractionally (no more than 1 FPS). Afterward, I shut the system down and took a nice four-day vacation to upstate NY. Upon my return last evening, I hit the power switch on the Old Dog, saw the system get as far as the video check during POST...heard a light pop through the speakers, and then everything went dark. After spending hours trying different processors, video adapters, etc., I realized that the Old Dog had died. Its VRM probably fried as a result of the high CPU frequency (which it was never designed to handle), and the abuse that it's endured over the course of the last four-plus years. I've managed to get a hold of a somewhat mediocre NForce2 U400 board to get me through for a while (an Abit NF7-S2), and had to finally give in and pin-mod the XP-M to get it to work at a decent frequency. This "setback" at least allowed me to compare the performance of the two systems back to back. So, just for grins & giggles, here are my findings:
     Old board: XP-M 3000+ @ 2366 MHz (145 MHz FSB & SDRAM speed), GeForce 6600 GT AGP
     WoV average FPS @ 1600x1200, 2x FSAA & 2x SF: 35 (low 17 / high 76)
     3DMark 2001 SE: 10984 @ default, no AA or AF, "quality" image setting
     New board: XP-M 3000+ @ 2400 MHz (200 MHz FSB & DDR RAM speed), GeForce 6600 GT AGP
     WoV average FPS @ 1600x1200, 2x FSAA & 2x SF: 38 (low 18 / high 76)
     3DMark 2001 SE: 17362 @ default, no AA or AF, "quality" image setting
     As you can see, the new board allowed me to ramp the CPU's clock frequency up slightly, to 2400 MHz (from 2366), and of course bestowed upon it the advantages of a higher FSB and the greater memory bandwidth of PC3200 DDR RAM. All that made only a slight difference in WoV, proving that it's much more CPU-intensive than anything else. 3DMark, on the other hand, saw a HUGE jump in score, proving that you cannot always trust a 3D benchmark when it comes to determining how well a system will run a flight sim!
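For anyone who wants to put percentages on those findings, here's a quick sketch in Python (the inputs are just the numbers quoted above, nothing more):

```python
# Percent gains from the board swap (KT-7A/SDRAM -> NF7-S2/DDR),
# same CPU family and the same GeForce 6600 GT in both runs.
def pct_gain(old, new):
    """Percent improvement of new over old."""
    return (new - old) / old * 100

wov_avg = pct_gain(35, 38)        # WoV average FPS: 35 -> 38
mark3d = pct_gain(10984, 17362)   # 3DMark2001 SE: 10984 -> 17362

print(f"WoV avg FPS gain: {wov_avg:.1f}%")  # ~8.6%
print(f"3DMark gain:      {mark3d:.1f}%")   # ~58.1%
```

An 8.6% sim gain versus a 58.1% synthetic gain makes the point pretty plainly.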
  3. Deuces isn't kidding when he says that it's tedious. I've tried it myself, and found it an absolutely mind-numbing process.
  4. If you mean by clicking on the "create a mission" button, I don't think so. You can, however, write your own mission, where you have a carrier that you can land and take off from, and then load it into WoV.
  5. There's one more thing to try before you call it quits... If you have an antivirus program, make sure it's updated, and run it. Also, did you make any restore points in XP before you started having this problem?
  6. How about trying this: manually delete the WoV folder and its contents from Windows. Then download and install CCleaner ( http://www.ccleaner.com/ ). Start CCleaner, hit the "Issues" tab, then hit the "Scan for Issues" button. If and when it finds any issues, select them all, and fix them. Keep repeating the process over and over until CCleaner reports that it cannot find any more issues. Then, repeat the process on "Applications". When you're finished, restart Windows, and try to install WoV. I hope this works for you.
  7. I built a high-end system for a friend in January of 1999, and at his request, used the largest hard drive available at the time (17 GB). I recall telling a coworker about all the goodies that were going into this monster, and his response was something to the effect of "A 17-gig hard drive! What the hell is he going to do with all that storage space?" Nowadays, my collection of patches, updates, and addons for the half-dozen sims I play wouldn't fit on so small a drive!
  8. 1 Gigabyte of video RAM? Damn! Here's a link to the article at Tom's hardware: LINK
  9. White shadows, or "ghosting", have been reported by a number of 6600-series owners in IL-2/PF, and appear to be driver-related, so the solution may be as simple as trying a different driver. Another 6600 user reported that ghosting occurred only when he set the image-sharpening slider too far to the right. Another solution is to simply turn the shadows off in the conf.ini file.
  10. Here's a series of images from WoV. It's a dogfight between myself in an F-8D and an AI MiG-17F. It lasted almost ten minutes, and went from the initial encounter at 14,000 feet on down to treetop level. You can see both the MiG and the Crusader engaged in a scissors. I finally got the MiG at low altitude, and fed him some 20 mike-mike.
  11. As far as using AIM-7s, AIM-9s, AIM-54s, AA-2s, etc., there is what's known as a "sticky thread" by USAFMTL at the top of the forum, regarding exactly this topic. Just to make it easy for you: http://forum.combatace.com/index.php?showtopic=8787 The SAMs are Surface-to-Air Missiles, and you have no control over them whatsoever in game. They're fired by the simulation's "AI", or Artificial Intelligence. Strike Fighters and Wings over Vietnam both contain Soviet-bloc SAMs and their support systems (SAM radars, early-warning radars, etc.). There are some third-party addon SAMs available as well, with their attendant launch platforms. There are even ship-launched SAMs that you can add, in case you're interested. Tip: You will know if you're targeted by a SAM if you are in an aircraft that has an RWR, or Radar Warning Receiver. You will also be warned by your wingman. Which brings up another point... When you receive a missile or SAM launch warning, hit the "R" key to target the threat, and the "F4" key to view it in cockpit, or the "F8" key for an external view (your plane to the selected threat). Hope this helps...
  12. Not yet, I'm still playing with it and awaiting "further" developments from Third Wire.
  13. Just in case anyone's interested, there are new nVidia drivers available, the 77.13 release. I've tested them on my GeForce 4, and they seem to have netted a nice 2-3 fps improvement in WoV. The IQ (image quality) is pretty good on these as well. USAFMTL reports a similar improvement in image quality on his GeForce 6800 GT.
  14. I was recently flying a "silver swallow" (MiG-17A) for the VPAF when I encountered an inbound strike package of F-105s escorted by F-4Cs. As I started my attack on the Thuds, an F-4 came up behind me, forcing me to break off. The next thing I know, I'm involved in a really hairy battle with this Phantom and one of his wingmen. During the course of said battle, the nearest F-4 engaged me in a rolling-scissors maneuver. The long and short of it is....that was absolutely the best dogfight I've had in any flight sim in a dog's age! This is the first time I've observed the AI fighting this way in either SF or WoV, and I attribute this to changes incorporated into the latest patch. I'm also running the "Insane Edition" of the Uber AI.
  15. USAFMTL and I just spoke about this a few days ago.....that is, the fact that this forum has been slow, and needed just a bit of TLC (or a swift kick in the arse) to get it going. How about a "Windows XP Tweaks" thread, something we can all contribute to, and hopefully, learn from?
  16. Nah, the 2400+ maxed out at just under 2300 MHz, a speed which required a core voltage of 1.85 V to POST, which in turn necessitated what my friends referred to as "the screaming black levitation device" to keep it cool. It also returned errors in Pi, which spoke volumes as to its overclocking potential. The XP-M, by contrast (and despite it being a 1.65 V part), runs 2400 MHz at the default voltage, and idles at 37C. It also returns no errors in Pi. I'm willing to bet that it's good for 2500+ with just a little bump in voltage. BTW, I'm running a Thermalright SK-7 with an 80mm Antec smart fan (40 CFM max). Yes, I can actually HEAR things in my room, and hold normal conversations...
  17. Buff, take another look at the fifth post on this thread.....that's OK, it happens to me all the time, too. :) You'd have to cut the L6 bridges (or pin-mod it) to get anything over 16.5x on this proc. CPUMSR only adjusts the multiplier up to the maximum that the CPU is physically set for. 2700 MHz? That's a helluva overclock...that should translate to an XP rating of about 4200+ !!!
  18. Wow, The original "Fail-Safe" was on the tube just a few days ago....and now, what do I come here and see?
  19. Are any of you experiencing micro-stutters in some games, such as LOMAC or any of the IL-2 series? If so, there's a possible cure, one that I can honestly say worked for me: adjusting your video card's latency value. I'm sure that some of you may have already known about this for a while (right, Buff?), but for the benefit of those that did not, here's a link to a thread at Guru3D detailing the problem, and the cure: AGP/PCI LATENCY. After reading the thread (and downloading and installing Guru3D's PCI Latency Tool as per the directions provided), I was quite surprised to not only find the expected latency value of 248 assigned to my GeForce 4, but also to see that my Promise ATA adapter was hogging the PCI bus as well, with its value set to 240! I experimented with AGP values between 64 and 128, and set the values of the other PCI devices at 32. The results? No more micro-stutters in IL-2 or LOMAC! I strongly suggest that you give this a try.
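To see why those latency numbers matter, here's a rough back-of-the-envelope sketch. The latency timer is the number of PCI clocks a device may keep holding the bus after another device requests it; assuming a standard 33 MHz PCI bus, you can convert the values above into bus-hold times:

```python
# Rough illustration only: convert PCI latency-timer values (in PCI
# clocks) into the maximum time a device can hog the shared bus,
# assuming a standard 33 MHz PCI clock.
PCI_CLOCK_HZ = 33_000_000  # 33 MHz PCI bus

def bus_hold_us(latency_clocks):
    """Max bus-hold time in microseconds for a given latency value."""
    return latency_clocks / PCI_CLOCK_HZ * 1e6

for dev, lat in [("GeForce 4 (as found)", 248),
                 ("Promise ATA (as found)", 240),
                 ("Other PCI devices (after tweak)", 32)]:
    print(f"{dev}: {lat} clocks = {bus_hold_us(lat):.1f} us")
```

Two devices that can each sit on the bus for seven-plus microseconds at a stretch is a plausible recipe for the audio/frame hitches people describe as micro-stutters.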
  20. Raptor, going from a GeForce 3 Ti500 to a GeForce 4 Ti4800, with double the video RAM plus higher-clocked GPU and memory speeds (300x650 vs 260x500), gained me a whopping 900 more points in 3DMark! So, going to a 6800 GT or even a 6600 GT would be a waste on this system, other than giving me DX9 capability. The very fact that the minimum fps in WoV went up more than the maximum fps with the additional 200 MHz CPU speed proves that I'm CPU- and memory-bus-limited. Buff, I may wind up pin-modding this beast after all, since I've managed to crank the FSB up to 145 MHz, yielding 2400+ MHz at the default voltage, with no issues.
  21. If you look at the first graph, you'll see that it actually surpasses the 3.2 GHz P4 in sheer computing (number-crunching) capability. The lower (Whetstone) graph is more an indication of instruction-set optimizations (SSE2, etc.) and memory bandwidth. In the case of, say, a Socket 939 Athlon 64, it buries the current P4 to the point that the once-controversial "P" rating is now considered too conservative! Anyway... Before we move on, here are the system specs: Abit KT-7A RAID, rev 1.3, KT7S_B4 BIOS; Athlon XP-M 3000+ CPU @ 2318 MHz (140 MHz FSB & memory bus); nVidia GeForce 4 Ti4800 with 63.72 Omega drivers; Turtle Beach Santa Cruz sound card; 3Com PCI ADSL modem; 1024 megs of Crucial PC133 CAS2 memory (mem timing at 2-2-2-5).
      The next benchmark I subjected the Old Dog to was meant to measure the performance of its memory subsystem. I knew that the KT-7A's SDRAM was in no way competitive with a newer system running even PC2100 (266 MHz DDR RAM), let alone PC3200. But nonetheless, a benchmark is still a benchmark, right? Using Aida32's (now Everest's) memory benchmark, I obtained a memory read score of 1064 MB/s (megabytes per second). It was at the top of the chart...as far as an SDRAM-based system was concerned. That score was, however, truly pathetic when compared to ANY DDR system, as the lowest score they provided for comparison was over 1500 MB/s, for a VIA KT266-based board running PC2100. An NForce2-based board running PC3200 at a 200 MHz FSB and memory bus (which would have been a much better system for this CPU) averages out at 2790 MB/s. A P4 system running an Intel i875P chipset with PC3200 averages out at 4880. A Socket 939 Athlon 64 running PC3200 in dual-channel mode with its integral memory controller benchmarks at almost 6000 MB/s.
      The next benchmark was 3DMark2001 SE. Now, I don't go out of my way to get a high score by doing anything other than switching off anti-aliasing and anisotropic filtering. The other options (image quality, etc.) were left on high, pretty much as I'd leave them when running Strike Fighters or LOMAC. I also do not disable sound, nor did I overclock the video card for the base scores. The base score for the old XP2400+ CPU, at 1024x768x32bpp, was 9380. With the XP-M 3000+ in place, the score went up just a skosh over 10%, to 10364. A year ago, I built a system for a friend based around an Abit NF7, with an XP2800+ running both the FSB and memory at 333 MHz, and using the same make and model video card that's presently in the Old Dog. It benchmarked at just over 12000 in 3DMark2001, just to give you a basis for comparison.
      So, was it worth it? After running WoV with all eye candy enabled (with the exception of shadows), I'd have to say.....a big "maybe". Overall, it seems to run smoother, no doubt due to the additional 200 MHz core speed and the larger level 2 cache (512 KB, as opposed to 256 KB on the original XP2400+). However, it still falls short of the performance one would expect of, say, an NForce2-based system. Maximum frame rates in WoV are up perhaps 3-5%, but the minimum fps rates now seem to stay within the "playable" range (just over 20 fps in cockpit @ 1600x1200x32, 2xAA 2xAF) with most of the eye candy switched on, something that I could not honestly say was possible before. So, perhaps the Old Dog will be around for a bit longer....but I'm starting to get that itch again....and Athlon 64 prices are coming down.
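Incidentally, that 1064 MB/s read score is no accident: it's exactly the theoretical peak of PC133 SDRAM (64-bit bus x 133 MHz). A quick sketch of the peak-bandwidth math, which also shows why the DDR systems above pull so far ahead:

```python
# Theoretical peak memory bandwidth for a 64-bit (8-byte) memory bus:
# clock x pumps-per-clock x bus width x channels.
def peak_mb_s(clock_mhz, pumps=1, bus_bytes=8, channels=1):
    """Theoretical peak bandwidth in MB/s."""
    return clock_mhz * pumps * bus_bytes * channels

print(peak_mb_s(133))                       # PC133 SDRAM -> 1064 MB/s
print(peak_mb_s(200, pumps=2))              # PC3200 DDR  -> 3200 MB/s
print(peak_mb_s(200, pumps=2, channels=2))  # dual-channel PC3200 -> 6400 MB/s
```

The measured scores (1064, 2790, ~6000) sit at or a bit below those theoretical ceilings, which is about what you'd expect from a real-world read benchmark.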
  22. BUFF, as it turned out, I believe I've made the right choice as far as this particular chipset and motherboard are concerned, as the lower-rated XP-Ms would have required a pin mod to reach the necessary multiplier setting. Now, back to the story. I started CPUMSR, clicked on the Frequency Control and Voltage tab, and dialed up the proc's default multiplier (which, short of a pin mod, was also its maximum). I hit the "set" button, and.....instant freeze-up! It seems that there was just one more step to take before all this could work, and in my haste, I'd overlooked it. In order for the throttling command to work, a necessary bit of code in the system's BIOS had to be set at the right value. This could not be accomplished through the CMOS settings (unless I compiled a new BIOS, or edited the existing one to allow for it, but let's not get ahead of ourselves). A bit of research verified that this software and the XP-M did indeed work on this chipset, but a bit of hacking was required. I had to set "bit 2, in register 55, to a value of 1", a task best accomplished with WPCREDIT and WPCRSET. I knew from previous experience dealing with compatibility issues between VIA chipsets and Creative Sound Blaster Live! audio cards that these two apps (both works of H. Oda) are must-haves for any self-respecting geek's bag of tricks. WPCREDIT is used to find and define the hex values for the necessary register change, and WPCRSET is used to make the change "permanent" (at least in the Windows registry). I followed the directions, found that they yielded a register value of "0D", set that in WPCREDIT, opened up CPUMSR yet again, dialed up the "boost", hit set, and.......... SUCCESS! I immediately entered the necessary values into WPCRSET, restarted Windows, and verified that they'd taken. Everything was working as it should, so it was now time to perform a few benchmarks.
      How did this processor, handicapped as it was by the KT-7A's antediluvian SDRAM, stack up against the same CPU in a "modern" system, which would be running at 400 MHz effective, with its PC3200 in dual-channel mode? I fired up SiSoft Sandra 2005 and decided to find out... First, the CPU Arithmetic benchmark (I don't know about you, but I hated math with a passion when I was in school...). The graph at the top shows the Old Dog's scores....whoa! Not too bad. Maybe it is possible to teach an old dog new tricks......
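The "set bit 2 in register 55 to 1" step is plain bit arithmetic, for anyone curious what WPCREDIT is actually doing. Note that the starting value of 0x09 below is my assumption, chosen because it's consistent with the final value of "0D" reported above:

```python
# Sketch of the register tweak: set bit 2 of PCI config register 0x55.
# The original value 0x09 is an assumed example, not a documented fact.
REGISTER = 0x55  # VIA chipset config register mentioned in the post
BIT = 2          # the bit CPUMSR needs set to 1

original = 0x09
patched = original | (1 << BIT)  # turn bit 2 on, leave the rest alone

print(f"register 0x{REGISTER:02X}: 0x{original:02X} -> 0x{patched:02X}")
# register 0x55: 0x09 -> 0x0D
```

Whatever the register held before, OR-ing in `1 << 2` flips only that one bit, which is exactly what you want when poking at chipset config space.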
  23. When the new CPU finally arrived, I immediately set about removing the Thoroughbred "B" XP2400+ from my system and replacing it with the XP-M 3000+. I cleared the CMOS, set the multiplier to "x13 and above", set the voltage to 1.65, and then crossed my fingers as I powered the system up. I was greeted with the message "unknown CPU at 1667 MHz". What the hell? Now, I fully expected the "unknown CPU" message, as my KT-7A lacked a BIOS with the proper microcode to identify the new CPU, but I was rather dismayed to see that it did not assign it a multiplier higher than 12.5x. Previously, this same motherboard had no problem assigning a multiplier of 15x to the XP-M's predecessor. I let the system boot into Windows, and set about trying to find out what could be done in regards to the multiplier issue. As it turns out, I discovered that I had several choices: pin-modding the CPU, pin-modding the socket, or taking advantage of the XP-M's throttling feature and using it to set the multiplier through software in Windows. This last one appealed to me most, as I frequently leave the machine on for days, running Folding@Home, and it tends to run quite warm as a result. A CPU that could be throttled down in core speed would obviously run cooler when ramped down to a reasonable speed, and perhaps I could also utilize this feature to provide protection should something catastrophic occur, such as failure of the CPU fan. Performing a Google search yielded results immediately, and I downloaded and installed a freeware app named "CPUMSR", written by Petr Koc and Miroslav Tvrz. (In order to use CPUMSR, you'll also need to download and install Andreas Valisky's LLA driver.) After carefully following the directions, I was ready to begin, or so I thought...
  24. The KT-7A runs on a 133 MHz front side bus. AMD calls it a "266 FSB", as it's double-pumped (conversely, Intel's P4 runs on a quad-pumped bus, hence the 800 MHz P4s are really clocked at 200 MHz). The fastest Socket A processor that AMD produces for a 266 FSB is the XP2400+...but wait, that's not entirely true. AMD produces a line of Socket A Athlon XP mobile processors, which run on a 266 MHz FSB, come equipped with a 512 KB L2 cache, and have a whole slew of features designed to make them laptop-friendly (lower power consumption, multiplier throttling, etc.). The most powerful of these is the "XP-M 3000+". It runs at a multiplier setting of 16.5, which on a 133 MHz (266) bus translates to an actual core speed of 2200 MHz. By way of comparison, the non-mobile (desktop) XP3000+ (recently derated from its original "3200+" rating) runs on a 200 MHz (400) FSB at 2100 MHz true core speed. So, I decided to bite the bullet to the tune of $137 US, and ordered an Athlon XP-M 3000+. The question was, would this processor be recognized by, and run on, my old-as-the-hills KT-7A? Stay tuned for the results....
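The clock math above is worth spelling out once, since the marketing names ("266 FSB", "400 FSB") hide the real numbers. Core speed is just multiplier times actual FSB, and the marketed figure is the FSB times the number of transfers per clock:

```python
# Core clock = multiplier x actual FSB; the Athlon bus is double-pumped,
# so the marketed "FSB" is twice the real clock.
def core_mhz(multiplier, fsb_mhz):
    return multiplier * fsb_mhz

def effective_fsb(fsb_mhz, pumps=2):  # 2 for Athlon, 4 for the P4 bus
    return fsb_mhz * pumps

print(core_mhz(16.5, 133.3))   # XP-M 3000+: ~2200 MHz actual core speed
print(effective_fsb(133))      # 133 MHz double-pumped -> marketed "266"
print(core_mhz(10.5, 200))     # desktop XP3000+ on a 200 MHz bus: 2100 MHz
```

The 10.5x multiplier for the desktop XP3000+ is inferred from the 2100 MHz / 200 MHz figures in the post, not something stated there directly.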
  25. I was recently considering building a new system to replace my aging Abit KT-7A. While it was a solid performer back in its day, it was starting to show its age, finding itself incapable of running today's titles at a decent resolution or frame rate. Or, can it? This particular mainboard had scaled from 933 MHz (an 800 MHz Athlon, unlocked, at 7x133 FSB), to 1333 MHz, to an XP1900 at 1680 MHz (12x140 FSB), and finally to 2100 MHz (an XP2400+ @ 2100 MHz w/140 FSB). Graphics-wise, it's gone from its original Voodoo 5500, to a GeForce 2 Ultra, to a GeForce 3 Ti500, and eventually to a GF4 Ti4800. There are several bottlenecks with this system. First, its FSB is only capable of a stable 145 MHz, as it has no way of adjusting the PCI or AGP clock. Bottleneck number two is its use of SDRAM. Number three is its 4x max AGP speed. While the last one is no biggy (I've yet to see anyone prove that there's any significant performance increase between AGP 4x and 8x), the other two seem serious enough to warrant scrapping the board at first glance. Or, are they? How would this board compare in benchmarks to a modern system? Well, there's only one way to find out......another CPU upgrade. Is it possible? Let's find out....