
Function of "autoperf"?


Shenanigunner


Does the autoperf {0|1} slash command have any function? It's still accepted as a valid command but doesn't seem to do anything.

UPDATED: v4.15 Technical Guide (post 27p7)... 154 pages of comprehensive and validated info on the nuts and bolts!
ALSO:  GABS Bindfile  ·  WindowScaler  ·  Teleport Guide  ·  and City of Zeroes  all at  www.Shenanigunner.com

 

Per the wiki:  Automatically change world detail for performance.

 

So, reduce resolution for better performance I guess.  If you're on a decent computer, you have power to spare so I wouldn't expect it to have to compromise on graphics.


Originally on Infinity.  I have Ironblade on every shard.  -  My only AE arc:  The Origin of Mark IV  (ID 48002)

Link to the story of Toggle Man, since I keep having to track down my original post.


Right, I know what the wikis and other lists say... just that one cryptic line that (like many other such "who effing knows" functions) doesn't really give any useful info. It does nothing on my (fairly powerful) system; I guess I'd like to hear from anyone running it on, say, a marginal laptop. Default is 0; set /autoperf 1 and see if it changes rendering, frame rate, etc.
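If anyone on a marginal machine wants an easy A/B test, a pair of binds can flip the setting while watching the FPS counter. (A sketch only: the key choices are arbitrary, and I'm assuming /showfps updates live while you toggle; adjust to taste.)

```
/bind F10 "autoperf 1$$showfps 1"
/bind F11 "autoperf 0$$showfps 1"
```

Hit F10, play for a minute somewhere busy, hit F11, and compare the frame rate and world detail.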


 

Well, while I'm not by any stretch tech savvy, this game is old, as in dinosaur-prehistoric old.  When I first went to download the game on wonderfully throttled hotspot speeds, a fellow RO network player pointed out it was designed to run on, ahem, dial-up.  I never noticed any issues loading.  While some graphics settings were added over the years, it's still very light on what computers and monitors can handle these days, I'm fairly certain.  Even Emmy's ancient laptop is, what, a year in the future from when Live went into the Sunset.  So unless the HC team has done some massive overhaul of the game's graphics I missed ...


5 hours ago, Ironblade said:

Per the wiki:  Automatically change world detail for performance.

 

So, reduce resolution for better performance I guess.

I doubt it has to do with resolution, adaptive resolution scaling wasn't really a thing back in CoH's heyday and has come more into vogue recently. I could be wrong, but it sounds more like it would adjust things like LoD (Level of Detail) and environment clutter (though it could be argued as part of LoD).

 

Basically render either less stuff and/or less detailed stuff in the world if performance is low.
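If that guess is right, the logic behind a setting like this could be as simple as nudging a world-detail level up or down against a frame-time target. A toy Python sketch of the idea (the function name, thresholds, and clamping are entirely my invention, not anything from CoH's engine):

```python
# Toy sketch of frame-time-driven detail scaling (not CoH's actual code).
def adjust_detail(detail: float, frame_ms: float,
                  target_ms: float = 33.3, step: float = 0.05) -> float:
    """Nudge a 0..1 world-detail factor toward the target frame time."""
    if frame_ms > target_ms * 1.2:      # running slow: shed detail
        detail -= step
    elif frame_ms < target_ms * 0.8:    # lots of headroom: restore detail
        detail += step
    return min(1.0, max(0.1, detail))   # clamp to sane bounds

# A slow frame pulls detail down; a fast frame pushes it back up.
d = adjust_detail(1.0, frame_ms=50.0)   # -> roughly 0.95
d = adjust_detail(d, frame_ms=10.0)     # back toward 1.0
```

That would also explain why EmmySky saw nothing change: if your settings are already at the floor, there's no detail left to shed.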

 

5 hours ago, EmmySky said:

My computer is a bit of a toaster, and I noticed no change.  Even had my graphics options tab open, saw no difference.  I already have it set fairly low to work with my machine, so maybe that's why.

Well, you can get bad gaming performance for a number of reasons:

 

CPU Bottlenecks:

- Simulating the game world for a frame. Typical things that can slow this down are physics, lots of entities nearby, etc. Things that could help would be turning off settings and lowering the draw distance (the game might calculate less for things that aren't visible).

- Rendering a frame involves figuring out what needs to be drawn, then giving the GPU instructions to perform. Each command takes a roughly fixed amount of overhead, but the amount of work done per command can vary. It's a lot cheaper to, say, issue one "draw these 1 million polygons" call than one million draw calls that draw 1 polygon each. This can be partially mitigated by smart engine design, but that only goes so far, and draw calls in OpenGL (which CoH uses) are pretty heavy. Again, a way to mitigate this would be to turn settings off (not just lower, but off) and reduce the draw distance.
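The per-call overhead point is easy to demonstrate outside of graphics entirely. The Python sketch below (my own illustration, nothing CoH-specific) does the same total work two ways: a million tiny calls versus one batched operation, which is the same shape of problem as a million one-polygon draw calls versus one big one:

```python
import time

# Per-call overhead demo: the same total work is far cheaper as one
# batched operation than as a million tiny calls.
N = 1_000_000
data = list(range(N))

def draw_one(x):
    """Stand-in for a tiny 'draw call' that does almost no work."""
    return x

t0 = time.perf_counter()
for x in data:                # one call per item: overhead dominates
    draw_one(x)
per_call_secs = time.perf_counter() - t0

t0 = time.perf_counter()
total = sum(data)             # one batched operation over all items
batched_secs = time.perf_counter() - t0

print(f"per-call: {per_call_secs:.3f}s, batched: {batched_secs:.3f}s")
```

On any machine the batched version wins by a wide margin; the "work" per item is identical, only the number of commands issued differs.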

 

GPU Bottleneck:

- The typical thought is that a GPU gets bottlenecked by pixels, and that is true. Reducing the resolution can help in this case, but there are other ways it can get strangled. Drawing more / higher-quality shadows is, in a way, more pixels. Scenes with a lot of alpha effects (translucent objects) are full of pixels that can't be culled, so they get drawn over and over.

- Drawing too many vertices/polygons can saturate that portion of the card. Drawing fewer objects, or lower-quality objects with fewer polygons, can help this.

- There is general compute power used in newer games (doing calculations on the GPU itself, things like GPU particles, etc).

- Memory bandwidth, transferring data between the CPU and GPU. This could be for things like loading in textures.

- Memory starvation, if you try to put more data on the GPU than it has memory to store. This will likely cause data to be swapped in/out of GPU memory, which is slooow. Imagine loading & saving stuff to the hard drive constantly in the middle of an intensive calculation.
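The memory-starvation math adds up faster than people expect. A quick back-of-envelope sketch (illustrative numbers of my own choosing, not measurements from CoH):

```python
# Back-of-envelope texture memory math (illustrative, not CoH's numbers).
def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Uncompressed RGBA texture size; a full mip chain adds about 1/3 more."""
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

one_tex = texture_bytes(1024, 1024)       # roughly 5.3 MB per texture
count = 200                               # textures for one busy scene
total_mb = one_tex * count / (1024 * 1024)
print(f"{count} 1024x1024 textures = about {total_mb:.0f} MB")  # roughly 1 GB
```

Two hundred uncompressed 1024x1024 textures is already around a gigabyte, which is why older or low-VRAM cards start thrashing in crowded zones.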

 

HDD Bottleneck:

- If the game requires retrieving a lot of data from the HDD (like traveling across a zone and needing to grab textures used on one side and not the other, or a player spawning in with costume pieces you don't currently have loaded).

Or any combination of those. Note that some of these issues can be exacerbated by engine design - something that the players have no control over.

 

 

I guess what I'm saying is, it's complicated... not sure if this was helpful, though.

