Again, sorry for my bad spelling; I hope this will help people, and that anyone googling this issue will find it. If there is something more I could add here, just let me know and I will edit this post. Other than those, I didn’t see any bugs or issues, and the game looks fantastic. I’ve already pre-ordered the game, so I may end up building a new PC to fix my CPU issues… we will see. So, in terms of CPU vs. GPU, TERA is a very CPU-intensive game.
The only reason to edit S1Option.ini would be if you wanted to change an in-game graphics/interface option outside of the client. You can open it up and take a look at the settings and you’ll understand what I mean. Now, wouldn’t you say TERA has a very weak player/mob max view distance? Some of the foliage at max settings shows up before players/mobs do.
And the further away you get from an object, the worse the resolution becomes. But ideally you shouldn’t notice a difference, because you’re sufficiently far from the texture. So I had to fiddle with the compatibility setting to see if Nvidia had truly selected the correct SLi profile. Make sure you’re using Fraps or another program that shows your FPS. Let go of your mouse, give your FPS a few seconds to settle, and take note of what it is. So, in summary, a single GTX460 should be able to play this game on max with no FPS problems.
If anyone has any ideas on why viewing additional mobs would create such massive CPU usage, please enlighten me. That one setting is the worst for me, and it’s the most important setting as well (for PvP, at least). If I leave a line out, it’s because I don’t know what it does, or changing it made no difference to my client’s game display. S1Option.ini is just a plain text file of your chosen in-game options.
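Since S1Option.ini is plain text, you can read and edit it in Notepad like any Windows-style INI file. The sketch below is only hypothetical — the section and key names here are made up for illustration, not TERA’s actual option names — but it shows the kind of key/value pairs the client writes out from your in-game choices:

```ini
; Hypothetical sketch only -- section and key names are illustrative,
; NOT TERA's real ones. Open your own S1Option.ini to see the actual
; keys the client wrote from your in-game choices.
[Options]
TextureResolution=2     ; whatever you picked in-game (0 to 2)
LightingEnrichment=1    ; the in-game "Lighting Enrichment" slider
```

Back up the file before editing; the client will simply rewrite any value you change in-game.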
If you have any performance issues in TERA, this will solve at least 95% of them.
The texture will go from high detail to lower detail as you move closer to it. Then grab your mouse and move it in circles. (If your CPU is the bottleneck.) For some reason, moving the mouse or giving the game client any kind of user input is a huge killer to your framerate. I have played games in the past, such as Ultima Online, that had a “Run mouse in a separate thread” setting.
P.S. My RAM wasn’t the issue; TERA never broke 1.3 GB of RAM usage on the noob island. And with Vista, programs get a default 1.9 GB to use, or 2.9 GB or so if you use a UserVA tweak. So what I do as a temporary fix is run GTX460 #1 as the single GPU, and use the other GPU as the SLi antialiasing unit. There are two files you can edit that will alter your graphics options.
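The UserVA tweak mentioned above is the standard Windows boot-configuration option, set with bcdedit. It only applies on 32-bit Vista/7 (the value is in MB, up to 3072), and it changes your boot configuration, so treat this as a sketch to research before running, not a recommendation:

```
:: Run from an elevated (Administrator) Command Prompt, 32-bit Windows only.
:: Raises the per-process user-mode address space to ~2.9 GB; reboot to apply.
bcdedit /set increaseuserva 2900

:: To undo the tweak later:
bcdedit /deletevalue increaseuserva
```

Setting the value too high can starve the kernel of address space, so stay within Microsoft’s documented 2048–3072 range.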
After all this testing, I concluded that no other compatibility setting exceeded the default enough to warrant changing it. So I left it at the default and gave Nvidia the benefit of the doubt. So, if you want any real form of AA and you are an Nvidia user, you have to set AA compatibility to 0x000100C5 in Nvidia Inspector. En Masse, if you’re reading this, I’d recommend you give the players a real AA option and use this hex code. Some of you might ask, “Why force AA when you can just change the ‘Lighting Enrichment’ option to 2 and get it?” Well, that is not real AA.
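For anyone following along in Nvidia Inspector, the relevant fields in the TERA profile look roughly like this. The compatibility flag is the one from this post; the mode/setting values are just an example of a typical forced-MSAA setup, not something I’m presenting as tested:

```
TERA profile in Nvidia Inspector (sketch):
  Antialiasing compatibility     0x000100C5
  Antialiasing - Mode            Override any application setting
  Antialiasing - Setting         4x [4x Multisampling]   (example value)
```

Remember to click Apply changes in Inspector and restart the client; as noted elsewhere in this post, every change needs a client restart to take effect.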
But my quad-core, newer, higher-speed CPU can’t handle more than 15 mobs/players on the screen at once without the FPS dropping significantly. So, since a single one of my GTX460s can handle TERA alone, I figured I could get away with using Ambient Occlusion, which is, to my knowledge, a completely GPU-powered task. Unfortunately, the Nvidia Control Panel has no setting to force Ambient Occlusion on for TERA. However, there is an Ambient Occlusion true/false setting in the S1Engine.ini file. And my other Unreal 3 Engine game has the same command in its .ini file’s SystemSettings section, and its ambient occlusion works just fine. OK, let me start off by saying I spent literally 20+ hours turning .ini settings on and off, and adjusting and testing Nvidia Inspector options.
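For completeness, here is what that edit looks like. AmbientOcclusion is a standard Unreal Engine 3 SystemSettings key (the same one the other UE3 game mentioned above uses); whether TERA’s client actually honors it is exactly what’s in question, so treat this as the thing to experiment with rather than a confirmed fix:

```ini
; S1Engine.ini -- standard Unreal Engine 3 key; whether TERA's
; client honors it is what the experiment above is testing.
[SystemSettings]
AmbientOcclusion=True
```

Restart the client after the change, since edits only load at startup.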
Go up to a player and look at their nameplate; set Enrichment to 1, then change it to 2. Their nameplate will get blurry and difficult to read unless you’re right up next to them. There is no FPS loss, to my knowledge, from turning Enrichment 2 on. But all it is doing is blurring your game, minus the interface. So, to summarize: normally the way game graphics work is that textures get mipmap / level-of-detail modes.
Every time I made a change I had to restart the client. I ran my game with both GTX460s in SLi, with the default AFR2 (Alternate Frame Rendering 2) setting. AFR2 is the optimum setting for almost every game I’ve played. So, in summary, there is no need to change the SLi compatibility setting as of now.
If you checked it, your mouse cursor would go from lagging hardcore to very nice and smooth. So my assumption is that moving your mouse around, opening a menu, or any other user input is very intense on the game client’s threads. This is most likely a LOD bias / coding bug, because textures should ALWAYS become higher resolution the closer the camera gets to them. The highest LOD level should never start 20 feet from the object; it should cover the entire 0–20 feet range. Instead, you will be seeing it at the 2nd-best resolution level.
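To make the mipmap/LOD argument concrete, here is a minimal Python sketch of how distance-based LOD selection is supposed to behave (the 20-foot threshold is taken from the example above; the function itself is illustrative, not TERA’s code). Correct selection is monotonic: moving the camera closer can only keep the level the same or raise the resolution, never lower it, which is why “blurry up close” reads as a LOD bias bug:

```python
import math

def mip_level(distance_ft, lod_bias=0.0):
    """Illustrative LOD pick: level 0 = full resolution, each higher
    level halves it. The whole 0-20 ft range maps to level 0, and the
    level only grows with distance -- never the reverse."""
    if distance_ft <= 20.0:
        base = 0.0
    else:
        base = math.log2(distance_ft / 20.0)
    return max(0, int(base + lod_bias))

# Correct behavior: closer is never blurrier.
print(mip_level(5))    # full resolution right next to the texture
print(mip_level(20))   # still full resolution at the threshold
print(mip_level(40))   # one level down, as expected
# A buggy positive bias is one way to get the reported "blurry up
# close" symptom: it pushes even the near range off level 0.
print(mip_level(5, lod_bias=1.0))
```

With a broken bias (or a broken threshold) the near range lands on level 1 instead of level 0, which matches the nameplate-blur symptom described above.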
And if you set it to 6 (max), you can see mobs about 3–4x as far. But going from setting 1 to setting 6 gives me a 15 FPS drop. For some reason, viewing 30 more yards of mobs kills my FPS. What exactly is changing in that code that’s making my CPU slow down my FPS by 30% or so? There is no reason that viewing additional mobs should kill a game’s FPS that much.
There could be a better profile, but I haven’t found it, and it’s not an Unreal 3 Engine game compatibility setting, so good luck finding it. This file has many sections that are universal to many Unreal 3 Engine games. I will list all of the settings I’ve edited myself and what I experienced from doing so. So when you get into that close-to-the-texture zone, you get max resolution, but if you zoom your camera closer to the texture, it will eventually “pop” back into the lower resolution mode. I noticed an error when playing the game at texture resolution setting 0 or 1 (2 being the max). I’m pretty sure any player can replicate this issue if they turn texture resolution down from max.
But if anyone got Ambient Occlusion to work, please share how you did it. So put the setting on 0 or 1, and zoom your camera away from your character, maybe halfway between max camera distance and right over your shoulder.
I believe it’s because of this FPS-killing code that additional mobs create. Textures are higher resolution, and models are higher polygon counts, than they were 15 years ago, yes. But I fail to see how my old Pentium 4 3.06 GHz single-core processor could run Dark Age of Camelot with 50+ people on the screen just fine, with my GPU being the bottleneck.