This article looks at how to handle a discrepancy between the output of the command-line xrandr tool and the results obtained from your own code; it should be a useful reference if you are facing the same problem.

Problem description

I need to programmatically get the refresh rate of a monitor.

When I type xrandr (1.4.1, openSUSE 13) on the command line I get:

Screen 0: minimum 8 x 8, current 1920 x 1200, maximum 16384 x 16384
VGA-0 disconnected primary (normal left inverted right x axis y axis)
DVI-D-0 connected 1920x1200+0+0 (normal left inverted right x axis y axis) 518mm x 324mm
   1920x1200      60.0*+
   1920x1080      60.0
   1680x1050      60.0
   1600x1200      60.0
   1280x1024      60.0
   1280x960       60.0
   1024x768       60.0
   800x600        60.3
   640x480        59.9
HDMI-0 disconnected (normal left inverted right x axis y axis)

This result is confirmed by nvidia-settings -q RefreshRate, among other things.

But... when I run the following code (origin: https://github.com/raboof/xrandr/blob/master/xrandr.c), compiled with g++ 4.8.1 (with -lX11 -lXext -lXrandr):

#include <cstdio>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main() {
    int nsize;
    int nrate;
    short *rates;
    XRRScreenSize *sizes;
    Display *dpy = XOpenDisplay(NULL);
    Window root = DefaultRootWindow(dpy);

    // RandR 1.1 view of the screen configuration
    XRRScreenConfiguration *conf = XRRGetScreenInfo(dpy, root);
    printf("Current rate: %d\n", XRRConfigCurrentRate(conf));

    // List every size the 1.1 interface reports, with its refresh rates
    sizes = XRRConfigSizes(conf, &nsize);
    printf(" SZ:    Pixels          Refresh\n");
    for (int i = 0; i < nsize; i++) {
        printf("%-2d %5d x %-5d", i, sizes[i].width, sizes[i].height);
        rates = XRRConfigRates(conf, i, &nrate);
        if (nrate)
            printf("  ");
        for (int j = 0; j < nrate; j++)
            printf("%-4d", rates[j]);
        printf("\n");
    }

    XRRFreeScreenConfigInfo(conf);
    XCloseDisplay(dpy);
    return 0;
}

I get:

Current rate: 50
SZ:    Pixels       Refresh
0   1920 x 1200   50
1   1920 x 1080   51
2   1680 x 1050   52
3   1600 x 1200   53
4   1280 x 1024   54
5   1280 x 960    55
6   1024 x 768    56
7    800 x 600    57
8    640 x 480    58
9   1440 x 900    59
10  1366 x 768    60
11  1280 x 800    61
12  1280 x 720    62

Why am I getting this result? What am I doing wrong?

The software uses OpenGL with GLEW. Can this have any influence? We do call glXQueryDrawable(dpy, drawable, GLX_SWAP_INTERVAL_EXT, &val), but only afterwards, and I do not think this should have any influence.

Recommended answer

I found the answer:

If the XRandR server supports version 1.2 of the protocol, then the appropriate functions need to be used (which I plan to do by copying snippets of code from https://github.com/raboof/xrandr/blob/master/xrandr.c where has_1_2 is true).
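
A quick way to branch on this at runtime is to ask the server for its RandR version first. Below is a minimal sketch using the standard XRRQueryExtension/XRRQueryVersion calls, with error handling kept to the bare minimum:

#include <cstdio>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main() {
    Display *dpy = XOpenDisplay(NULL);
    int event_base, error_base, major, minor;
    if (!dpy || !XRRQueryExtension(dpy, &event_base, &error_base) ||
        !XRRQueryVersion(dpy, &major, &minor)) {
        fprintf(stderr, "RandR extension not available\n");
        return 1;
    }
    // Essentially the condition behind has_1_2 in xrandr.c
    if (major > 1 || (major == 1 && minor >= 2))
        printf("RandR %d.%d: per-output 1.2 API available\n", major, minor);
    else
        printf("RandR %d.%d: only the 1.1 screen-config API\n", major, minor);
    XCloseDisplay(dpy);
    return 0;
}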

My code in the question uses functions for version 1.1 of the protocol, and therefore only the metamodes are returned.
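
For reference, here is a minimal sketch of the 1.2-style query (my own simplification, not the exact xrandr.c code): enumerate the outputs with XRRGetScreenResources, find the mode driving each connected output's CRTC, and derive the refresh rate as dotClock / (hTotal * vTotal). Note that xrandr.c's mode_refresh() additionally adjusts vTotal for doublescan and interlaced modes.

#include <cstdio>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main() {
    Display *dpy = XOpenDisplay(NULL);
    Window root = DefaultRootWindow(dpy);

    // RandR 1.2: real outputs and modes instead of the 1.1 metamodes
    XRRScreenResources *res = XRRGetScreenResources(dpy, root);
    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        if (out->connection == RR_Connected && out->crtc) {
            XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
            // Find the mode currently driving this CRTC
            for (int m = 0; m < res->nmode; m++) {
                const XRRModeInfo *mode = &res->modes[m];
                if (mode->id != crtc->mode)
                    continue;
                double rate = 0.0;
                if (mode->hTotal && mode->vTotal)
                    rate = (double)mode->dotClock / (mode->hTotal * mode->vTotal);
                printf("%s: %ux%u @ %.1f Hz\n",
                       out->name, crtc->width, crtc->height, rate);
            }
            XRRFreeCrtcInfo(crtc);
        }
        XRRFreeOutputInfo(out);
    }
    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}

Compiled the same way as the snippet in the question (g++ with -lX11 -lXrandr), this reports the actual per-output rate rather than the metamode values returned by the 1.1 calls.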

As a simple check, I tried the following two commands:

xrandr --q1

xrandr --q12

And indeed the first one gives me the same result I get programmatically.

Credit goes to http://www.ogre3d.org/forums/viewtopic.php?f=4&t=65010&start=200

This concludes the article on the discrepancy between command-line XRandR and your own code; hopefully the recommended answer is helpful.
