FIX This - WW2

Talk about anything here.
Evilzen
Posts: 27
Joined: Wed Nov 03, 2004 10:17 pm
Location: NJ

Post by Evilzen »

Well, ever since the release of Myth II 1.5, WW2 has been dying off because the older players can't stand playing with whatever you guys changed. No one can play in the swamp areas on Recon. I have a 128 MB DDR GeForce and I get 3 fps in Myth, which is an old game. This is terrible. Some players can't even run it, which is why they stop playing. Such bad FPS ruins the game that has kept the community alive. People with excellent video cards should be able to run this game with no problem. Is there any way this can be addressed? It's hard for me to play even in my own host. The only place I get above 50 fps is the plains area. If anything can be done, I'm sure the community would really appreciate it. Thank you.
best thing since sliced bread
CIK
Posts: 1127
Joined: Mon Mar 01, 2004 9:08 pm

Post by CIK »

Yes, this is something we would very much like to fix. The problem is that we have been unable to reproduce it on any of our hardware. We have also had a hard time finding a player with the problem who actually wants to help us find and fix it, instead of just swearing at us.
User avatar
TarousZars
Site Admin
Posts: 565
Joined: Wed Mar 31, 2004 9:15 pm
Location: Utah, USA
Contact:

Post by TarousZars »

I'll help you, CIK. Just tell me what to do.

My fps drops to almost nothing on my PC with an NVIDIA GeForce4 w/ 64 MB of RAM. My old iMac w/ 2 MB of RAM handled swamps without a problem. I mostly play coop nowadays, so I haven't worried about it a ton, because I hardly play Recon. But when I do, I don't even go near swamps, as it means an instant fps drop/death.

Just tell me how I can help.

Tarous
Graydon
Posts: 1605
Joined: Sun Mar 21, 2004 5:10 pm

Post by Graydon »

That's a new piece of information for me, Evilzen. In every game of Recon I've ever played on 1.5 on PMnet, I've never heard anyone complain of FPS drops in swamps. And quite honestly, I see a whole lot more WW2 maps being played than regular these days, more so in ranked than unranked, of course. In 1.4 on my 233 MHz iMac I would get poor FPS in swamps, but with a 1.25 GHz G4, with whatever built-in video card it has, I get 500-1200 FPS in swamps. So I don't know what you guys are complaining about. As CIK said, none of us can replicate it on our hardware, testers nor coders. If you want to make a difference, perhaps you can sign up to be a tester, help these guys work towards a fix, and help your dwindling buddies on PMnet by getting them better FPS in 1.5.1. Good thoughts?

If you can't take the time, I wouldn't have bothered posting about it :)
User avatar
Baron LeDant
Posts: 364
Joined: Mon Mar 22, 2004 9:14 pm

Post by Baron LeDant »

By the way, EZ, Autumn Recon doesn't suffer anywhere near as badly in swamps, so maybe get people to switch to that if they're having issues.
I wear a plastic ass on my bag ~ Da Cheeze
sillek
Posts: 128
Joined: Mon Mar 22, 2004 3:03 am
Location: US
Contact:

Post by sillek »

I wonder what FPS you would get if you stuck an old Voodoo card in a modern-day machine? I've heard that Myth was built for Voodoo cards and other older cards. The technology in newer NVIDIA and ATI cards is much different and doesn't run it as fast as it could.

I personally have never experienced a drop in FPS in any swampy area, either on this computer [800 MHz iMac G4] or my older computer [233 MHz iMac G3]. The difference between a grassy area and a swampy area is only around 1-3 fps on average. The only time water ever lowered my FPS was in those early 1.4 betas where Mac users were getting killed by water.

Your best bet for getting this fixed would be to coordinate some testing with somebody working on the patches [CIK, Myrd], either here, through e-mail, or on Hotline [hl.udogs.net]. I don't know what you could do for them (they'd know that, not me), but I'm sure there is some kind of testing that would help.
Evilzen
Posts: 27
Joined: Wed Nov 03, 2004 10:17 pm
Location: NJ

Post by Evilzen »

Well, I would be glad to help you guys. Graydon, I never see you playing anymore; maybe you should ask around a bit. Everyone I know has the problem. That's why more players from Reich left. I get 3 fps in swamps and nearly 80 fps in plains. Autumn Recon doesn't help that much; it still has the same effect because of all the leaves falling off the trees. I don't really like Autumn anyway; it's not on the level of the original Recon. People make things too complicated nowadays. I'm going to be hosting a Recon tourney by December. If I can help you guys with anything, let me know. Thanks, guys.
best thing since sliced bread
User avatar
TarousZars
Site Admin
Posts: 565
Joined: Wed Mar 31, 2004 9:15 pm
Location: Utah, USA
Contact:

Post by TarousZars »

Graydon wrote:none of us can replicate it on our hardware... testers nor coders... If you want to make a difference perhaps you can sign up to be a tester and help these guys out with working towards a fix


I have that problem, and I'm already a beta tester.

I actually get consistently better fps on my PowerBook, which has a 32 MB video card, than I do on my PC, which has a 64 MB video card. The PC can get around 80-90 normally, then drop to 3-4 in plains or areas with lots of projectiles. RotD kills me badly if I stay mid; I have to go north or south. My laptop usually gets 20-30 fps no matter where I go. Both are made by NVIDIA. Funky.

Just tell me what to test, and what to do with the results.

Tarous Zars

Edited By TarousZars on 1099609832
User avatar
Baak
Posts: 1109
Joined: Sat Mar 20, 2004 6:26 pm
Location: Mything

Post by Baak »

I may be way off base here, and if so just let me know, but is this the same as, or similar to, the "water on some maps slows everything down" bug in 1.5 that I believe has been fixed in the builds since?

I know we (our order, which plays the public 1.5 weekly) have serious problems with water on maps such as Picket Fences and especially Green Paradise (which has water in the center).

Watching films of these games can quickly turn into a slide show when panning over or viewing areas of water, sometimes with the effect that it doesn't go away until the film is reloaded.

Again, if this is just confusing things, ignore it, but I thought this particular bug had been addressed since the release of 1.5. Perhaps it's the same or similar?

I'm on a 1 GHz AMD PC with 512 MB RAM and a 32 MB ATI Radeon SDR. I play on WinMe but see the same problems when viewing films on Win2K Pro in the public 1.5 release.
Alan
Posts: 8
Joined: Fri Jun 18, 2004 10:43 am

Post by Alan »

Here's what I average on two different computers, with Myth set to 1024 res:

--Win2k on Athlon 900 MHz, ATI Radeon 7200 64 MB DDR, 512 MB RAM--

Hardware mode:
40-70fps plains
5-10fps swamps

Software mode:
20-30 plains
15-25 swamps


--Win2k on Pentium 4 3.3 GHz, ATI Radeon 9800 Pro 128 MB DDR, 1 GB RAM--

Hardware mode:
200-220fps plains (450+ in some spots)
15-20 swamps

Software mode:
160-180 plains
115-140 swamps
Evilzen
Posts: 27
Joined: Wed Nov 03, 2004 10:17 pm
Location: NJ

Post by Evilzen »

I have a 1.3 GHz Athlon, 512 MB DDR RAM, and a 128 MB GeForce FX 5200.
I get about 40-80 in plains,
3-8 in swamps.
If I had a Trow, I get 120-140 fps.
best thing since sliced bread
User avatar
Baak
Posts: 1109
Joined: Sat Mar 20, 2004 6:26 pm
Location: Mything

Post by Baak »

This may be simplistic/naive, but I have a suggestion:

How hard would it be to have a preference that says "Use New Video Rendering", and if it is not checked (which would be the default), just use the old rendering code from 1.3?

Just separate the two.

This is not like separating unit/object behavior, which would cause OOS issues, right? None of this goes over the network; it's all rendered locally, right?

If the new video rendering is causing problems for a large number of people where it didn't before, while helping some, it seems to me that instead of attempting to create new rendering that works on every possible platform/video card, it would be much simpler to just offer "Use New" or "Use Old", with the default being the old.

Is this a ridiculous idea?
Evilzen
Posts: 27
Joined: Wed Nov 03, 2004 10:17 pm
Location: NJ

Post by Evilzen »

Sounds good, because some people already play in software mode to deal with the fps problems. I don't see why it wouldn't work. I don't know how to do it, but I think it could be done.
best thing since sliced bread
User avatar
TarousZars
Site Admin
Posts: 565
Joined: Wed Mar 31, 2004 9:15 pm
Location: Utah, USA
Contact:

Post by TarousZars »

Myrd asked me to do some 1.3/1.4/1.5 tests. I tested this on a 2.4 GHz Pentium 4 w/ 512 MB RAM and an NVIDIA GeForce4 64 MB video card, on WinXP Home.

Here are the results:

North Hill
              1.3      1.4      1.5
1280x1024 H:  128 fps  98 fps   80 fps
1280x1024 S:  98 fps   70 fps   47 fps
800x600   H:  128 fps  106 fps  80 fps
800x600   S:  128 fps  106 fps  67 fps

Swamps
              1.3      1.4      1.5
1280x1024 H:  12 fps   5.9 fps  7 fps
1280x1024 S:  31 fps   41 fps   25 fps
800x600   H:  13 fps   8 fps    7.2 fps
800x600   S:  44 fps   34 fps   29 fps

Please note, though: while I did get better fps in 1.3, I had such nasty input lag that it was unplayable. The fps stayed constant at 128 on North Hill, but it took me 30 seconds to select a unit. Not close to playable. I'd sacrifice 40 fps in a second to actually be able to move. 1.4 suffered minor input lag; playable, but still pretty crappy. And as you can see, they all totally sucked in swamps.

My swamps scenario wasn't super scientific. I basically took 16 soldiers to swamps and blew up the firenade pile. The fps you see is the lowest number I saw appear.

So while swamps suck, and if you can fix them I'd love you, I still think it is better with 1.5.
Evilzen
Posts: 27
Joined: Wed Nov 03, 2004 10:17 pm
Location: NJ

Post by Evilzen »

Until 1.5 it was playable on my machine. I never had this problem before 1.5, and neither did anyone else.
best thing since sliced bread
Post Reply