Everything posted by Fubar512

  1. If you mean by clicking on the "create a mission" button, I don't think so. You can, however, write your own mission, where you have a carrier that you can land on and take off from, and then load it into WoV.
  2. There's one more thing to try before you call it quits... If you have an antivirus program, make sure it's updated, and run it. Also, did you make any restore points in XP before you started having this problem?
  3. How about trying this: Manually delete the WoV folder and its contents from Windows. Then download and install CCleaner ( http://www.ccleaner.com/ ). Start CCleaner, hit the "Issues" tab, then hit the "Scan for Issues" button. If and when it finds any issues, select them all and fix them. Repeat the process until CCleaner reports that it cannot find any more issues. Then repeat the process on "Applications". When you're finished, restart Windows and try to install WoV. I hope this works for you.
  4. Dual GeForce 6800 features 1 GB of RAM!

    I built a high-end system for a friend in January of 1999 and, at his request, used the largest hard drive available at the time (17 GB). I recall telling a coworker about all the goodies that were going into this monster, and his response was something to the effect of "A 17-gig hard drive! What the hell is he going to do with all that storage space?" Nowadays, my collection of patches, updates, and addons for the half-dozen sims I play wouldn't fit on so small a drive!
  5. 6600GT help!

    White shadows, or "ghosting", have been reported by a number of 6600-series owners in IL-2/PF, and the problem appears to be driver-related, so the solution may be as simple as trying a different driver. Another 6600 user reported that ghosting occurred only when he set the image-sharpening slider too far to the right. Another solution is to simply turn the shadows off in the conf.ini file, as sketched below.
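
    For reference, here's roughly where that setting lives in IL-2/PF's conf.ini. This is from memory, so treat the section name and value as a sketch and check your own file before editing (and back conf.ini up first, since the game rewrites it):

    [Render_OpenGL]
    Shadows=0 <----- 0 should disable shadows entirely; the stock value enables them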
  6. nVidia 77.13 Drivers

    Just in case anyone's interested, there are new nVidia drivers available, the 77.13 release. I've tested them on my GeForce 4 and seem to have netted a nice 2-3 fps improvement in WoV. The IQ (image quality) is pretty good on these as well. USAFMTL reports a similar improvement in image quality on his GeForce 6800 GT.
  7. I was recently flying a "silver swallow" (MiG-17A) for the VPAF when I encountered an inbound strike package of F-105s escorted by F-4Cs. As I started my attack on the Thuds, an F-4 came up behind me, forcing me to break off. The next thing I know, I'm involved in a really hairy battle with this Phantom and one of his wingmen. During the course of said battle, the nearest F-4 engaged me in a rolling scissors maneuver. The long and short of it is....that was absolutely the best dogfight I've had in any flight sim in a dog's age! This is the first time I've observed the AI fighting this way in either SF or WoV, and I attribute it to changes incorporated into the latest patch. I'm also running the "Insane Edition" of the Uber AI.
  8. Here's a series of images from WoV. It's a dogfight between myself in an F-8D and an AI MiG-17F; it lasted almost ten minutes and went from the initial encounter at 14,000 feet on down to treetop level. You can see both the MiG and the Crusader engaged in a scissors. I finally got the MiG at low altitude and fed him some 20 mike-mike.
  9. As far as using AIM-7s, AIM-9s, AIM-54s, AA-2s, etc. goes, there's what's known as a "sticky thread" by USAFMTL at the top of the forum regarding exactly this topic. Just to make it easy for you: http://forum.combatace.com/index.php?showtopic=8787 The SAMs are Surface-to-Air Missiles, and you have no control over them whatsoever in game. They're fired by the simulation's "AI", or Artificial Intelligence. Strike Fighters and Wings Over Vietnam both contain Soviet-bloc SAMs and their support systems (SAM radars, early-warning radars, etc.). There are some third-party addon SAMs available as well, with their attendant launch platforms. There are even ship-launched SAMs that you can add, in case you're interested. Tip: You will know if you're targeted by a SAM if you are in an aircraft that has an RWR, or Radar Warning Receiver. You will also be warned by your wingman. Which brings up another point... When you receive a missile or SAM launch warning, hit the "R" key to target the threat, then the "F4" key to view it from the cockpit, or the "F8" key for an external view (your plane to the selected threat). Hope this helps...
  10. Not yet, I'm still playing with it and awaiting "further" developments from Third Wire.
  11. Flight of the Old Dog

    USAFMTL and I just spoke about this a few days ago.....that is, the fact that this forum has been slow, and needed just a bit of TLC (or a swift kick in the arse) to get it going. How about a "Windows XP Tweaks" thread, something we can all contribute to, and hopefully, learn from?
  12. Flight of the Old Dog

    Nah, the 2400+ maxed out at just under 2300 MHz, a speed which required a core voltage of 1.85 V to POST, which in turn necessitated what my friends referred to as "the screaming black levitation device" to keep it cool. It also returned errors in Pi, which spoke volumes as to its overclocking potential. The XP-M, by contrast (and despite being a 1.65 V part), runs at 2400 MHz at the default voltage and idles at 37C. It also returns no errors in Pi. I'm willing to bet that it's good for 2500+ with just a little bump in voltage. BTW, I'm running a Thermalright SK-7 with an 80mm Antec smart fan (40 CFM max). Yes, I can actually HEAR things in my room, and hold normal conversations...
  13. Flight of the Old Dog

    Buff, take another look at the fifth post in this thread.....that's OK, it happens to me all the time, too. :) You'd have to cut the L6 bridges (or pin-mod it) to get anything over 16.5x on this proc. CPUMSR only adjusts the multiplier up to the maximum that the CPU is physically set for. 2700 MHz? That's a helluva overclock...that should translate to an XP rating of about 4200+ !!!
  14. Are any of you experiencing micro-stutters in some games, such as LOMAC or any of the IL-2 series? If so, there's a possible cure, one that I can honestly say worked for me: adjusting your video card's latency value. I'm sure that some of you may have already known about this for a while (right, Buff?), but for the benefit of those that did not, here's a link to a thread at Guru3D detailing the problem and the cure: AGP/PCI LATENCY. After reading the thread (and downloading and installing Guru3D's PCI Latency Tool as per the directions provided), I was quite surprised to not only find the expected latency value of 248 assigned to my GeForce 4, but also to see that my Promise ATA adapter was hogging the PCI bus as well, with its value set to 240! (The latency timer governs how many clocks a device may hold the bus per transaction, so two devices with values that high can starve everything else.) I experimented with AGP values between 64 and 128, and set the values of the other PCI devices at 32. The results? No more micro-stutters in IL-2 or LOMAC! I strongly suggest that you give this a try.
  15. Wow, The original "Fail-Safe" was on the tube just a few days ago....and now, what do I come here and see?
  16. Flight of the Old Dog

    Raptor, going from a GeForce 3 Ti500 to a GeForce 4 Ti4800, with double the video RAM plus higher-clocked GPU and memory speeds (300x650 vs 260x500), gained me a whopping 900 more points in 3DMark! So going to a 6800GT, or even a 6600GT, would be a waste on this system, other than giving me DX9 capability. The very fact that the minimum fps in WoV went up more than the maximum fps with the additional 200 MHz of CPU speed proves that I'm CPU- and memory-bus-limited. Buff, I may wind up pin-modding this beast after all, since I've managed to crank the FSB up to 145 MHz, yielding nearly 2400 MHz at the default voltage, with no issues.
  17. Flight of the Old Dog

    If you look at the first graph, you'll see that it actually surpasses the 3.2 GHz P4 in sheer computing (number-crunching) capability. The lower (Whetstone) graph is more an indication of instruction-set optimizations (SSE2, etc.) and memory bandwidth. In the case of, say, a Socket 939 Athlon 64, it buries the current P4 to the point that the once-controversial "P" rating is now considered too conservative! Anyway...

    Before we move on, here are the system specs: Abit KT7A-RAID, rev 1.3, KT7S_B4 BIOS; Athlon XP-M 3000+ CPU @ 2318 MHz (140 MHz FSB & memory bus); nVidia GeForce 4 Ti4800 with 63.72 Omega drivers; Turtle Beach Santa Cruz sound card; 3Com PCI ADSL modem; 1024 megs of Crucial PC133 CAS2 memory (timings at 2-2-2-5).

    The next benchmark I subjected the Old Dog to was meant to measure the performance of its memory subsystem. I knew that the KT-7A's SDRAM was in no way competitive with a newer system running even PC2100 (266 MHz DDR RAM), let alone PC3200. But nonetheless, a benchmark is still a benchmark, right? Using AIDA32's (now Everest) memory benchmark, I obtained a memory read score of 1064 MB/s (megabytes per second). It was at the top of the chart...as far as an SDRAM-based system was concerned. That score was, however, truly pathetic when compared to ANY DDR system:
    - The lowest comparison score provided was over 1500 MB/s, for a VIA KT266-based board running PC2100.
    - An nForce2-based board running PC3200 at a 200 MHz FSB and memory bus (which would have been a much better home for this CPU) averages out at 2790 MB/s.
    - A P4 system on an Intel i875P chipset w/PC3200 averages out at 4880 MB/s.
    - A Socket 939 Athlon 64 running PC3200 in dual-channel mode off its integral memory controller benchmarks at almost 6000 MB/s.

    The next benchmark was 3DMark2001SE. Now, I don't go out of my way to get a high score by doing anything other than switching off anti-aliasing and anisotropic filtering. The other options (image quality, etc.) were left on high, pretty much as I'd leave them when running Strike Fighters or LOMAC. I also do not disable sound, nor did I overclock the video card for the base scores. The base score with the old XP2400+ CPU, at 1024x768x32, was 9380. With the XP-M 3000+ in place, the score went up just a skosh over 10%, to 10364. A year ago, I built a system for a friend based around an Abit NF7, with an XP2800+ running both the FSB and memory at 333 MHz, using the same make and model of video card that's presently in the Old Dog. It benchmarked at just over 12000 in 3DMark2001, just to give you a basis for comparison.

    So, was it worth it? After running WoV with all the eye candy enabled (with the exception of shadows), I'd have to say.....a big "maybe". Overall, it seems to run smoother, no doubt due to the additional 200 MHz of core speed and the larger Level 2 cache (512 KB, as opposed to 256 KB on the original XP2400+). However, it still falls short of the performance one would expect of, say, an nForce2-based system. Maximum frame rates in WoV are up perhaps 3-5%, but the minimum fps now seems to stay within the "playable" range (just over 20 fps in cockpit @ 1600x1200x32, 2xAA/2xAF) with most of the eye candy switched on, something that I could not honestly say was possible before. So perhaps the Old Dog will be around for a bit longer....but I'm starting to get that itch again....and Athlon 64 prices are coming down
  18. Flight of the Old Dog

    BUFF, as it turned out, I believe I made the right choice as far as this particular chipset and motherboard are concerned, as the lower-rated XP-Ms would have required a pin mod to reach the necessary multiplier setting. Now, back to the story.

    I started CPUMSR, clicked on the Frequency Control and Voltage tab, and dialed up the proc's default multiplier (which, short of a pin mod, was also its maximum). I hit the "set" button, and.....instant freeze-up! It seems that there was just one more step to take before all this could work, and in my haste, I'd overlooked it. In order for the throttling command to work, a necessary bit of code in the system's BIOS had to be set to the right value. This could not be accomplished through the CMOS settings (unless I compiled a new BIOS, or edited the existing one to allow for it, but let's not get ahead of ourselves). A bit of research verified that this software and the XP-M did indeed work on this chipset, but a bit of hacking was required. I had to set "bit 2, in register 55, to a value of 1", a task best accomplished with WPCREDIT and WPCRSET. I knew from previous experience dealing with compatibility issues between VIA chipsets and Creative Sound Blaster Live! audio cards that these two apps (both works of H. Oda) are must-haves for any self-respecting geek's bag of tricks. WPCREDIT is used to find and define the hex values for the necessary register change, and WPCRSET is used to make the change "permanent" (at least in the Windows registry).

    I followed the directions and found that they yielded a register value of "0D" (which makes sense: 0D hex is 00001101 binary, i.e., the register's existing contents with bit 2 now set). I set that in WPCREDIT, opened up CPUMSR yet again, dialed up the "boost", hit set, and.......................... SUCCESS! I immediately entered the necessary values into WPCRSET, restarted Windows, and verified that they'd taken.

    Everything was working as it should, so it was now time to perform a few benchmarks. How would this processor, handicapped as it was by the KT-7A's antediluvian SDRAM, stack up against the same CPU in a "modern" system, running a 400 MHz bus with PC3200 in dual-channel mode? I fired up SiSoft Sandra 2005 and decided to find out... First, the CPU Arithmetic benchmark (I don't know about you, but I hated math with a passion when I was in school...). The graph at the top shows the Old Dog's scores....whoa! Not too bad. Maybe it is possible to teach an old dog new tricks......
  19. Flight of the Old Dog

    When the new CPU finally arrived, I immediately set about removing the Thoroughbred "B" XP2400+ from my system and replacing it with the XP-M 3000+. I cleared the CMOS, set the multiplier to "x13 and above", set the voltage to 1.65, and then crossed my fingers as I powered the system up. I was greeted with the message "unknown CPU at 1667 MHz". What the hell? Now, I fully expected the "unknown CPU" message, as my KT-7A lacked a BIOS with the proper microcode to identify the new CPU, but I was rather dismayed to see that it did not assign it a multiplier higher than 12.5x. Previously, this same motherboard had no problem assigning a multiplier of 15x to the XP-M's predecessor. I let the system boot into Windows and set about trying to find out what could be done about the multiplier issue. As it turns out, I had several choices: pin-modding the CPU, pin-modding the socket, or taking advantage of the XP-M's throttling feature and using it to set the multiplier through software in Windows. This last one appealed to me most, as I frequently leave the machine on for days running Folding@Home, and it tends to run quite warm as a result. A CPU that could be throttled down in core speed would obviously run cooler when ramped down to a reasonable speed, and perhaps I could also utilize this feature to provide protection should something catastrophic occur, such as a failure of the CPU fan. A Google search yielded results immediately, and I downloaded and installed a freeware app named "CPUMSR", written by Petr Koc and Miroslav Tvrz (in order to use CPUMSR, you'll also need to download and install Andreas Valisky's LLA driver). After carefully following the directions, I was ready to begin, or so I thought....
  20. Flight of the Old Dog

    The KT-7A runs on a 133 MHz front-side bus. AMD calls it a "266 FSB", as it's double-pumped (conversely, Intel's P4 runs on a quad-pumped bus, hence the 800 MHz P4s are really clocked at 200 MHz). The fastest Socket A processor that AMD produces for a 266 FSB is the XP2400+...but wait, that's not entirely true. AMD also produces a line of Socket A Athlon XP mobile processors, which run on a 266 MHz FSB, come equipped with a 512 KB L2 cache, and have a whole slew of features designed to make them laptop-friendly (lower power consumption, multiplier throttling, etc.). The most powerful of these is the "XP-M 3000+". It runs at a multiplier setting of 16.5, which on a 133 MHz (266) bus translates to an actual core speed of 2200 MHz. By way of comparison, the non-mobile (desktop) XP3000+ (recently derated from its original "3200+" rating) runs on a 200 MHz (400) FSB at a 2100 MHz true core speed. So, I decided to bite the bullet to the tune of $137 US and ordered an Athlon XP-M 3000+. The question was: would this processor be recognized by, and run on, my old-as-the-hills KT-7A? Stay tuned for the results....
  21. If this is the same person who said they could see the weapons and effects, but not the aircraft itself, then you probably do not have all the LOD files in the aircraft's folder. LODs are the actual 3D model files. Without them, you will not have a visible aircraft (or weapon, or ship, or ground object). Make sure that there are files with the .lod extension in the "offending" aircraft's folder, something like the example below.
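
    Purely as an illustration (the file names here are hypothetical; your aircraft's will differ), a healthy aircraft folder looks something like this:

    Objects/Aircraft/MiG-17F/
    MiG-17F.ini <----- main aircraft ini
    MiG-17F_DATA.ini <----- flight model / systems data
    MiG-17F.LOD <----- the 3D model itself; no .lod, no visible aircraft
    MiG-17F_LOD2.LOD <----- optional lower-detail model(s) for distance

    The skin textures live in their own subfolders, but the .lod files must sit right in the aircraft's folder.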
  22. Ziker, I don't have a lot of experience tinkering with the PGMs in SFP1, as I've been mostly flying WoV for the last few months. I can tell you about the ECM & CMs, though. The ECM works quite well; you can (occasionally) actually hear an acquisition radar lose lock when you toggle/cycle it, if your A/C of choice is equipped with an RWR. Its jamming strength is set by a value in its, or the aircraft's, data.ini file. Chaff will likewise cause a missile and/or acquisition radar to lose lock; it has an RCS parameter whose value can be tinkered with. Flares may work too well at times, usually when an AI opponent uses them to foil an IRM shot that you carefully set up! Again, there is a heat-signature parameter with a variable that can be set to mimic realistic (or fantasy!) values. The values are set in the countermeasure.ini file, which resides in the ObjectData.cat and must be extracted from there (using Sky Pat's SFP1E extractor, which I believe is available on this site's download page) and placed in the Objects folder. The parameters are:

    [ChaffData]
    EffectName=ChaffEffect
    RadarCrossSection=100.0 <----- (measured in square meters per bundle)
    MaxVisibleDistance=12000
    DragFactor=0.07
    LifeTime=3.0
    Length=0.5
    VelocityDeviation=0.3

    [FlareData]
    EffectName=FlareEffect
    MaxVisibleDistance=12000
    HeatSignature=100.0 <------ ( µm as value? )
    DragFactor=0.05
    LifeTime=5.0
    VelocityDeviation=0.3

    I hope you found this useful!
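
    Just to make the "tinkering" concrete, here's the sort of one-line change I mean (the value is purely illustrative): doubling the chaff's radar cross-section should make it noticeably better at breaking a radar missile's lock.

    [ChaffData]
    RadarCrossSection=200.0 <----- twice the stock 100.0

    The same idea applies to HeatSignature under [FlareData] for IR missiles.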
  23. This is going back almost two years. This was my first effect/tweak: a simple adjustment of the muzzle flash and rate of fire on the MiG-17's 37mm cannon. I also gave it a unique sound (which you can hear over the twin 23mm) and tweaked the tracer size, so you can see both the slower velocity and higher trajectory of the larger (37mm) rounds. (See the sketch below for roughly where those edits live.) 1.6 MB Movie
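
    For anyone who wants to try this themselves, the edits go in the gun's entry in the extracted GUNDATA.INI. This is only a sketch from memory; the entry number, type name, and values here are all illustrative, and the parameter names may not match your install exactly, so verify against your own file:

    [GunData083] <----- hypothetical entry number; the N-37D's will vary per install
    TypeName=37MM_N37D <----- hypothetical name; use whatever your GUNDATA.INI calls it
    ROF=400 <----- rounds per minute, well below the twin 23mm's rate
    TracerSize=0.3 <----- enlarged so the slow 37mm rounds stand out
    GunFireSound=N37D <----- points to a .wav in the Sounds folder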
  24. Remember our earlier discussion regarding shrapnel, and how it was implemented only as a visual effect? How it would be nice if we could have active, effective, damage-causing shrapnel? Look what popped up during one of my late-nite dll perusing sessions...
  25. Here's a movie showcasing some of the AAA effects that are now available (or will be in the very near future). You'll see three separate flak-burst effects: 37mm, 57mm, and 85-130mm (yes, the 37mm bursts leave gray-white puffs) ;) I've taken the liberty of using several existing weapon models and editing their respective data.ini files to create an M1939 37mm AA gun, a KS-12 85mm AA gun, and a KS-30 130mm AA gun. The gundata.ini has also been edited to create larger tracers for the 37-85mm weapons, as well as adjusting the tracer length and tracer quantity (from one-in-five to every other round being a tracer on the larger, slower-firing weapons); see the sketch below. Be forewarned, this is a rather large file (12 MB+). Click Here to see it. Enjoy! BTW...I've disabled the Intruder's engine sound so you can enjoy the "external" sound FX :D
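
    For the curious, the tracer changes boil down to a few values per gun entry in GUNDATA.INI. The parameter names below are from memory and the values are illustrative, so treat this as a sketch and verify against your extracted file:

    TracerLoading=2 <----- every 2nd round is a tracer (stock was 5, i.e., one-in-five)
    TracerSize=0.35 <----- larger tracer, easier to follow
    TracerLength=2.0 <----- stretched to emphasize the round's trajectory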