I host an online game on a machine whose operating system is Windows Server 2012 R2 64-bit. The game is a single executable that runs constantly. I used to host this very same game on a Windows 7 machine, where its RAM usage sat around 350 MB and neither rose nor fell no matter how many days the game ran without a system restart. But on Windows Server 2012 R2 the RAM usage of that online game keeps rising, hour by hour, day by day. After 9 days it has already reached 1 GB and will most likely keep climbing (no restart has been done in these 9 days). So is this some feature of the way Windows Server 2012 R2 handles program memory? This did not happen when I hosted the game on the Windows 7 machine.
Most likely this is an incompatibility between the game and the new Windows version: steady, unbounded growth like this is the signature of a memory leak, not of normal OS memory management.
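Before blaming the OS, it may be worth confirming that the growth is really in the game process itself. Here is a minimal sketch using Python's third-party psutil library (`pip install psutil`) that logs the process's working set and private bytes once per hour; `game.exe` is a placeholder, so substitute your actual executable name:

```python
import time
import psutil

TARGET = "game.exe"  # hypothetical name; replace with your server's executable

def find_process(name):
    """Return the first running process whose name matches, or None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            return proc
    return None

def main():
    proc = find_process(TARGET)
    if proc is None:
        raise SystemExit(f"no process named {TARGET} found")
    while True:
        mem = proc.memory_info()
        # On Windows, rss is the working set; 'private' (present in recent
        # psutil versions on Windows) is memory that cannot be shared with
        # other processes, i.e. the number that grows in a genuine leak.
        private = getattr(mem, "private", 0)
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  "
              f"working set: {mem.rss / 1024**2:7.1f} MiB  "
              f"private: {private / 1024**2:7.1f} MiB")
        time.sleep(3600)  # one sample per hour

if __name__ == "__main__":
    main()
```

If private bytes climb in step with the working set, the process is genuinely leaking; if only the working set grows, Server 2012 R2 may simply be trimming working sets less aggressively than Windows 7 did.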
Windows Server 2012 R2 uses the Windows 8.1 kernel (NT 6.3), two kernel releases after Windows 7.
Right, Server 2012 R2 is very much like Windows 8.1, but not much changed under the hood between the Windows 7 and Windows 8 kernels.
Except we got the Lego Duplo interface, even on Server.
You can, however, launch a Windows 7 VM via Hyper-V and run the game server there. Of course, that is not ideal for a game server, since the virtualization layer adds overhead and latency.
If you can, switch to Linux: it would use less memory and cost a lot less in licensing. Of course, this is only a viable solution if the games you host have Linux binaries.