Lol, 320 fps players must be running around like they're on meth. Set your framerate to 60 fps if you have a 60 Hz monitor; anything over 60 fps you are not seeing anyway.

    jerc haha no no, it plays totally different m8. But I would set it to 120, 180 or 240.

    You really don't, and your link confirms that, but if you like UT looking mental and working your video card harder than it was meant to, etc... go for it.

    There are plenty of examples on the net explaining why a high refresh rate is better, but until you experience it for yourself it's a pointless argument. I game at 240 Hz with my frames locked at 200. Playing at 60 fps is like looking at a slide show for me.

    What are our limits?
    No idea, but our eyes do not see in Hz or FPS; our brain creates its own natural motion blur for fast-moving things that it can't keep up with.

    I found a great article a while ago explaining the misinformation that has spread round the net. I'll try to dig it up 👍

      I have a 165 Hz monitor with frames locked to 165, and it's a big difference from when it was running at 198 fps due to an XOpenGL issue I had. If you are going to mix and match, it's probably better to have the refresh rate higher than the fps.

      Oh, and my 1660 Ti was struggling to stay cool, and I believe it's a bit more powerful than a GTX 550.

      i only have a 60hz monitor 🙁

      19 days later

      Mine is a laptop. The screen locks at 60 Hz and when it heats up it goes down to 40, so... yeah, life is good

      6 days later

      CPU: i7 6700K overclocked to 4.6 GHz, 4 cores / 8 threads.
      GPU: GTX 970 3.5GB
      RAM: 24GB DDR4 2400MHz
      Mouse: Logitech g305 wireless, 400DPI, 1000hz, 0.5 in game using raw input.
      Keyboard: ACER SK9611 (Bought for £1.75 from a charity shop, never been used)
      Monitor: ASUS VG248 144hz 1920x1080

      Video Driver: OpenGL (NOT XOpenGL)
      Vsync: Off
      Decals: Off
      Dynamic Lighting: Off
      Gore Level: Ultra Low
      Weapon: Hidden.

      FPS pretty much locked to 312 fps (average 310-312 for a whole match) with netspeed 20000. I also played at 40000 netspeed, where fps is limited to 624, but it wasn't as rock solid.

      FPS online is netspeed / 64, unless set lower by your video driver's max frame rate.
      Netspeed 5000 is 78.125 max fps.
      Netspeed 10000 is 156.25 max fps.
      Netspeed 15000 is 234.375 max fps.
      Netspeed 20000 is 312.5 max fps.
      Netspeed 25000 is 390.625 max fps.
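      To make the cap explicit, here is a minimal sketch of that netspeed / 64 rule in Python; the function name and the optional driver-limit parameter are just for illustration, not anything taken from UT's code.

      ```python
      # Sketch of the online fps cap described above: cap = netspeed / 64,
      # unless the video driver's own frame rate limit is lower.
      def online_fps_cap(netspeed: int, driver_fps_limit: float | None = None) -> float:
          cap = netspeed / 64.0
          if driver_fps_limit is not None:
              cap = min(cap, driver_fps_limit)  # the lower of the two limits wins
          return cap

      for ns in (5000, 10000, 15000, 20000, 25000):
          print(ns, online_fps_cap(ns))  # 78.125, 156.25, 234.375, 312.5, 390.625
      ```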

      15 days later

      AMD Ryzen 9 5950X
      G.Skill Trident Z RGB DIMM Kit 32GB, DDR4-3600, CL14-15-15-35
      AMD Radeon™ RX 6800 XT // 6900 XT // RTX 3090 (testing around)
      2x 1TB WD Black SN750 Gaming M.2 2280 PCIe 3.0 x4
      Mobo: X570-F Gaming
      Mouse: Logitech G302 - Polling Rate: 500 Hz (feels snappier)
      Keyboard: Ducky One (Red Switches)
      Display: 32" Curved Gaming Monitor Odyssey G7
      https://www.samsung.com/ch/monitors/gaming/odyssey-g7-32-inch-240hz-1ms-curved-lc32g75tqsrxen/

      (Monitor-Settings)
      Refresh-Rate: 240Hz
      Black Equalizer: 1
      Response Time: Fastest
      Freesync: Off
      Low Input Lag: On
      Picture-Mode: FPS

      Video-Renderer: Switching between d3d9 and Vulkan (currently the Vulkan API)
      Patch: 469c
      Vsync: as always OFF
      Decals: Off
      Dynamic Lighting: Off
      Gore Level: Ultra Low
      Weapon: Hidden
      FOV: 105-110

      Netspeed: 100000 (or however high the cap is :>)
      Lanspeed: 1000000 (just to remove the limit)

      FPS showing a constant 999 (maybe some advice on MapVoting etc.)

      Windows 11 Enterprise - tweaked & debloated
      Internet & Network: unfortunately just 300mbit VDSL2

      My Nicknames in Game: «T3ddy» or noob©² :> or any others.

      History: Clanbase.ggl (yes, I am old :>), Nations Cup (played for Team.Germany), Hosted Cup, Europe Cup, etc., mostly iCTF

      5950X - 3080ti - 32gb 3800 cl14
      monitor asus pg279qm 2560x1440@240hz
      mouse logitech g pro x superlight
      keyboard corsair k100

      completely overkill for this 23-year-old game

      2 months later
      • i7 9700k
      • Gigabyte mb (only 1 I could get at the time)
      • Win10 Home (retail)
      • Zowie 24" 144hz primary monitor
      • ACER 24" 60hz secondary
      • 16GB DDR4-3200
      • RTX 3070 using DX11 currently
      • 500GB M.2 NVMe primary drive
      • 250GB EVO 850 SSD
      • 2TB WD drive
      • 4TB WD external USB 3.0 for backups
      • Razer DeathAdder Essential

      Way more than UT needs, but DX11 uses ~50-55% GPU on the 3070. OpenGL uses less than half of that. Built this back in 2020, when everything was hard to find, especially GPUs.

      OpenGL is a lot better than it used to be, especially as it uses fullscreen correctly now like Direct3D 9 did, making it feel super snappy and responsive.


      Processor: 11th Gen Intel(R) Core(TM) i9-11900K @ 3.50GHz
      Installed RAM: 32.0 GB (31.8 GB usable)
      System type: 64-bit operating system, x64-based processor
      1080p at 120hz
      Edition: Windows 11 Pro
      Version: 21H2
      GeForce RTX 2060

      It's all about the internet; my mouse and keyboard are no-name 🙁 (that's a problem)

      AMD Ryzen 5 1600X Six-Core Processor 12 Threads 3.60 GHz
      16GB G.Skill RGB DDR4 3600
      NVIDIA GeForce GTX 1660 Ti
      Mouse Swiftpoint Z 700
      Monitor AOC 32" 165hz
      Keyboard AZIO MGK L80

      a month later

      Main PC died 🙁 (motherboard seems to be fucked)

      (Server/Backup PC)
      i5-2400
      12GB DDR3 RAM
      No dedicated GPU (the CPU cooler blocks the PCI slot atm lol)

      I must have over-tightened the CPU cooler mounting at some point, because I noticed the CPU is slightly bent...

      At least everything else is fine; I just need to get another CPU/motherboard, and it isn't that expensive to get something close to what I had, or better.

        Zim With 240 Hz (each pixel of the monitor is refreshed 240 times in one second) and those 60 fps (the graphics card puts out 60 individual images, also in one second), simplifying things, the monitor will refresh its pixels four times for each image generated by the graphics card. So a 240 Hz monitor will be "limited" by the FPS setting on the graphics card, which here is 60.
        An important factor in how we perceive the image is synchronizing the graphics card's frames per second with the Hz of your screen. You've probably heard of VSync, G-Sync and FreeSync, technologies that try to synchronize these two frequencies. The problem is that when we turn on VSync, the frames generated by the graphics card sometimes "wait" to be displayed on the screen, so they are not always shown as quickly as we would like. The second, potentially more serious drawback is the problem of "choking" the image: VSync helps in situations where the number of frames produced by the graphics card is too large compared to what the monitor wants to display. In the opposite scenario, where the card is not able to generate the 240 frames the display expects because more is happening on screen than the GPU can process, frame drops become much more noticeable with VSync enabled. This causes image stutter, which can be much more annoying than tearing.
        Nvidia decided to approach the synchronization problem from the other end: instead of forcing the graphics card to synchronize with the monitor's frequency, G-Sync makes the monitor try to match its refresh rate to the number of frames produced by the graphics card.
        As a result, frame drops are much less noticeable, as the monitor transitions quite smoothly between the different values, and delays in response to keystrokes are imperceptible.
        FreeSync is AMD's answer to Nvidia's G-Sync. Basically it's the same.
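        Just to put numbers on the 240 Hz vs. 60 fps example above, here is a tiny illustrative calculation; it is not specific to UT or any driver, just the ratio the paragraph describes.

        ```python
        # How many monitor refreshes fall on each GPU frame.
        def refreshes_per_frame(monitor_hz: float, gpu_fps: float) -> float:
            return monitor_hz / gpu_fps

        print(refreshes_per_frame(240, 60))   # 4.0 -> each frame is held for 4 refreshes
        print(refreshes_per_frame(240, 400))  # 0.6 -> more frames than refreshes; without sync this tears
        ```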

        Fun fact.
        What is the optimal frequency in first person shooter games so that we can react relatively quickly or even see some movement? World research shows that it is only from 7 to 13 Hz, i.e. 7 to 13 images in one second. This means that while playing, watching TV or even in real life, the remaining frames are just an interpretation of our brain.
        It is also interesting that professional film studios use a 24-frame standard. An exception is, for example, the trilogy of films about Hobbits by Peter Jackson. He decided to record it at 48 fps.
        Regards.

        You could probably get a graphics card from dumpster diving that will put out more than 60 fps in UT99.

          jerc My GPU and everything minus the motherboard/CPU is still fine; the CPU cooler is complete overkill for an i5-2400 (it struggles to even get into the 30°C+ range) and blocks the slots for a GPU.

          But I am surprised by the performance of the integrated GPU; it can get over 200 fps at 1080p settings, but it is limited to 60 Hz even on a 144 Hz monitor.