New “options” are always welcome, but please don't abandon MSAA.
Why on earth would you completely remove the only “proper” AA method we have in video games today for yet another post-process-based AA, of which we already have two (FXAA + SMAA)? I don't think most of the playerbase is still on some AMD Athlon + GeForce2 from around 2000, and your game isn't that demanding by today's standards. Also, my two cents on your comparison video of the techniques: please don't try to fool people…
You chose a scene (slow camera sway, no fast movement, and probably recorded at high framerates) that “complements” TAA, which is known for ghosting and blurriness, especially at low framerates (and low-framerate players are exactly the target audience of this AA method, because they want every frame they can squeeze out).
So which player with high framerates would, in their right mind, use FXAA (it has improved over the years but still likes to eat up smaller details) or SMAA (IMO the best post-process AA) when they could have good old MSAA, which produces the “best” results in terms of visual clarity and a crisp image?
The change was probably made to reduce lag.
The rendering happens on the client side and is not affected by server performance or connectivity. The point is, like in every other game, you adjust the GFX detail settings to the point where your machine can handle it; no big deal or anything special… I'm only on a GeForce 1060 / 6 GB and run the game fine with 8x MSAA and transparency AA at 120 fps in 1920x1080. I just don't get why they want to remove it completely. Just add TAA to the available options (like they did with SMAA) and let the players choose what's fine for them and works on their setup…
Weird how the human eye can only see 30 to 60 frames per second. Anything higher is just for idiots.
It’s true. People can’t really see more than 60 frames per second. So all the gamers hyping their high frame rates are just kind of being sheep who have been tricked into thinking that frame rates above 60 even matter. They don’t.
Don't really know what this has to do with the topic…
But it's also funny how people always feel the need to point out that the human eye starts seeing single frames as a fluid animation at around 30 fps while totally missing the point → input latency…
If you like playing games with vsync on at 60 fps and think that's great, keep doing so…
Never liked any kind of AA, too blurry, all of them. Read somewhere that supersampling is technically the best form of AA. Give it a try; the game looks alright for me without AA but supersampled at 125–150%.
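The trade-off with that render-scale trick is raw pixel count: the pixels shaded grow with the square of the slider value. A minimal sketch of the arithmetic (illustrative numbers; not tied to Crossout or any particular engine):

```python
# Rough cost of render-scale supersampling: the internal framebuffer grows
# linearly in each dimension, so shaded pixels grow with the scale squared.
# Numbers are illustrative, not engine-specific.

def supersample_cost(width, height, scale):
    """Internal render resolution and pixel-count multiplier for a render scale."""
    rw, rh = int(width * scale), int(height * scale)
    return rw, rh, (rw * rh) / (width * height)

for scale in (1.0, 1.25, 1.5):
    rw, rh, mult = supersample_cost(1920, 1080, scale)
    print(f"{int(scale * 100)}%: {rw}x{rh} -> {mult:.2f}x pixels shaded")
```

So a 125% render scale already shades about 1.56x the pixels of native 1080p, and 150% costs 2.25x, which is why it only feels “free” on a GPU with headroom.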
Ignore the other two ladies in here; they're on consoles, indicated by the loose usage of the word “lag” for anything that isn't running smoothly and the inferiority complex displayed when it comes to higher fps.
Yeah, yeah, yeah, PC this, PC that.
OK, this maybe explains that kind of strange response to the topic. You are right about supersampling, but because of its performance cost, multisample AA was introduced as a good “middleground”.
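Roughly speaking, the “middleground” is about shader work: 4x SSAA runs the pixel shader for every sample, while 4x MSAA shades each pixel about once and only multiplies the coverage/depth samples at triangle edges. A back-of-the-envelope comparison (illustrative model, not measured engine data):

```python
# Back-of-envelope shader cost at 1920x1080 (illustrative simplification):
# SSAA runs the pixel shader per sample; MSAA shades roughly once per pixel
# and pays for the extra samples mainly in coverage/depth storage, not shading.

PIXELS = 1920 * 1080

def ssaa_shader_invocations(samples):
    return PIXELS * samples  # every sample is fully shaded

def msaa_shader_invocations(samples):
    return PIXELS  # one shade per pixel; extra samples only resolve edge coverage

ratio = ssaa_shader_invocations(4) / msaa_shader_invocations(4)
print(f"4x SSAA does {ratio:.0f}x the shader work of 4x MSAA")
```

That's why 4x MSAA is playable on mid-range cards while true 4x supersampling usually isn't.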
Guess you are right; if MSAA gets removed I will just have to mess around with the render-scale option until I find a “sweet spot” that works for me…
I’m on PS4. I had some lag issues in Crossout at one point, but now I never lag. Why is that? I turned off anti-aliasing. I get some screen tears, but TBH I don’t even notice them. No lag. That’s what’s important.
I bet all these people complaining about lag keep anti-aliasing on.
The new AA should cause significantly fewer lag issues. It’s a good change.
Can't really say much about consoles because the last ones I owned were back in the 90s (SNES/PlayStation), but it should be very similar to PC technique-wise. When vsync is on, the graphics card only renders as many frames as the monitor/TV can show per refresh cycle (60 Hz / 60 fps) to get rid of screen tearing (frames “shifted” partway down the screen). BUT whenever you drop below those 60 fps, the previously rendered frame is displayed again to avoid said tearing, and that is what people notice as stuttering (lag, whatever you want to call it).

So I guess with AA on you lose a frame here and there on console, which results in the described effect. It's more the “fault” of the vsync technique than of the AA itself. I don't know if you can turn off vsync on console, but that should “fix” the problem, with the downside that you can get tearing because graphics card and display are no longer forced to work in sync.
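That “repeat the previous frame” behaviour is why render times even slightly over the 16.7 ms budget collapse straight to 30 fps. A toy model of the quantization (a sketch of the general vsync mechanism, not any specific game's or console's frame pacing):

```python
# Toy model of vsync frame pacing: each frame is displayed for however many
# whole refresh intervals its render time spans, so the effective framerate
# is quantized to refresh_hz / 1, / 2, / 3, ... (general mechanism, not
# modeling any specific title).
import math

def displayed_fps(render_ms, refresh_hz=60):
    """Effective framerate after vsync rounds render time up to whole refreshes."""
    frame_budget_ms = 1000.0 / refresh_hz
    refreshes = max(1, math.ceil(render_ms / frame_budget_ms))
    return refresh_hz / refreshes

print(displayed_fps(15.0))  # -> 60.0: ready within every 16.7 ms budget
print(displayed_fps(18.0))  # -> 30.0: just missed one refresh, frame repeats
print(displayed_fps(40.0))  # -> 20.0: spans three refresh intervals
```

Note there is nothing between 60 and 30: missing the budget by even 1 ms halves the displayed framerate, which is exactly the stutter people blame on “lag”.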
Also, I think you guys get me wrong here…
I'm not saying “don't implement TAA” (besides, I think it's one of the worst AA methods).
The only reason I opened this topic is the point in the news where they stated that multisample AA gets removed!
With games like these, vsync and particle density are a huge factor in how much lag occurs. So on PC it's best to turn all that crap off. On console we can't really modify things like particle density. That's probably the main reason consoles, as devs have stated, report more occurrences of lag.
Ah well, that explains a lot. I wasn't sure if today's consoles allow you to adjust some general detail settings, because hardware-wise they are pretty much standardised compact gaming PCs.
Decreasing game graphics now fixes internet/server-related problems???
When did they change the graphics to be server-sided???
Many console players are no experts when it comes to how PCs work; “lag” means anything that isn't smooth. Just read the posts, almost nothing checks out.
Except I’m right. PC games I play have this same issue. You’re making it sound like I’m talking nonsense, but lowering those settings or turning them off works. You’re making it sound like I don’t know what I’m talking about, but have you tested it? And it doesn’t work how you’re mockingly saying I think it does. It really depends on the game and its engine as well.