Problem Description
I was recently struck by a curious idea: take input from /dev/urandom, convert relevant characters to random integers, and use those integers as the RGB/x-y values for pixels to paint onto the screen.
I've done some research (here on StackOverflow and elsewhere) and many suggest that you can simply write to /dev/fb0 directly as it is the file representation of the device. Unfortunately, this does not seem to produce any visually apparent results.
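To illustrate, the kind of direct write I had in mind looks roughly like this (a minimal sketch rather than the exact code I used; the byte count is a hard-coded guess at the screen size, and it needs write permission on /dev/fb0):

// Sketch of the naive approach: copy random bytes straight into /dev/fb0.
// The 1920x1080 @ 32bpp size below is an assumption; the mmap program in the
// edit below queries the real geometry via ioctl instead.
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
    int rnd = open("/dev/urandom", O_RDONLY);
    int fb = open("/dev/fb0", O_WRONLY);
    if (rnd == -1 || fb == -1) {
        perror("open");
        exit(1);
    }

    char buf[4096];
    size_t total = 0, limit = 1920 * 1080 * 4; /* assumed screen size */
    ssize_t n;
    while (total < limit && (n = read(rnd, buf, sizeof buf)) > 0) {
        if (write(fb, buf, (size_t)n) != n)
            break; /* ran past the end of the framebuffer */
        total += (size_t)n;
    }

    close(rnd);
    close(fb);
    return 0;
}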
I found a sample C program, originally from a Qt tutorial (no longer available), that uses mmap to write to the buffer. The program runs successfully, but again there is no output to the screen. Interestingly enough, when I put my laptop into suspend and later resumed it, I saw a momentary flash of the image (a red square) that had been written to the framebuffer much earlier. Does writing to the framebuffer no longer work for painting to the screen in Linux? Ideally, I'd like to write a (ba)sh script, but C or similar would work as well. Thanks!
EDIT: Here's the sample program...may look familiar to vets.
#include <stdlib.h>
#include <unistd.h>
#include <stdio.h>
#include <fcntl.h>
#include <linux/fb.h>
#include <sys/mman.h>
#include <sys/ioctl.h>
int main()
{
    int fbfd = 0;
    struct fb_var_screeninfo vinfo;
    struct fb_fix_screeninfo finfo;
    long int screensize = 0;
    char *fbp = 0;
    int x = 0, y = 0;
    long int location = 0;

    // Open the file for reading and writing
    fbfd = open("/dev/fb0", O_RDWR);
    if (fbfd == -1) {
        perror("Error: cannot open framebuffer device");
        exit(1);
    }
    printf("The framebuffer device was opened successfully.\n");

    // Get fixed screen information
    if (ioctl(fbfd, FBIOGET_FSCREENINFO, &finfo) == -1) {
        perror("Error reading fixed information");
        exit(2);
    }

    // Get variable screen information
    if (ioctl(fbfd, FBIOGET_VSCREENINFO, &vinfo) == -1) {
        perror("Error reading variable information");
        exit(3);
    }
    printf("%dx%d, %dbpp\n", vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);

    // Figure out the size of the screen in bytes
    screensize = vinfo.xres * vinfo.yres * vinfo.bits_per_pixel / 8;

    // Map the device to memory
    fbp = (char *)mmap(0, screensize, PROT_READ | PROT_WRITE, MAP_SHARED, fbfd, 0);
    if (fbp == MAP_FAILED) {
        perror("Error: failed to map framebuffer device to memory");
        exit(4);
    }
    printf("The framebuffer device was mapped to memory successfully.\n");

    // Draw a 200x200 square with its top-left corner at (100, 100)
    for (y = 100; y < 300; y++) {
        for (x = 100; x < 300; x++) {
            // Figure out where in memory to put the pixel
            location = (x + vinfo.xoffset) * (vinfo.bits_per_pixel / 8) +
                       (y + vinfo.yoffset) * finfo.line_length;
            if (vinfo.bits_per_pixel == 32) {
                *(fbp + location) = 100;                     // Some blue
                *(fbp + location + 1) = 15 + (x - 100) / 2;  // A little green
                *(fbp + location + 2) = 200 - (y - 100) / 5; // A lot of red
                *(fbp + location + 3) = 0;                   // No transparency
            } else { // assume 16bpp
                int b = 10;
                int g = (x - 100) / 6;       // A little green
                int r = 31 - (y - 100) / 16; // A lot of red
                unsigned short int t = r << 11 | g << 5 | b;
                *((unsigned short int *)(fbp + location)) = t;
            }
        }
    }

    munmap(fbp, screensize);
    close(fbfd);
    return 0;
}
Answer

If you're running X11, you MUST go through the X11 APIs to draw to the screen. Going around the X server is very broken (and, as you've seen, often simply does not work). It may also cause crashes or general display corruption.
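As a rough illustration of going through X11 (my own sketch, not from any particular source; it assumes the libX11 headers are installed and links with -lX11), plotting random pixels via the server looks something like this:

// Minimal Xlib sketch: every drawing request goes through the X server,
// so it works regardless of which framebuffer driver sits underneath.
// Assumes a 24/32-bit TrueColor visual for the random colour values.
#include <X11/Xlib.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    // Connect to the X server; this is the step that writing to /dev/fb0 skips
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }
    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr), 0, 0, 400, 400, 0,
                                     BlackPixel(dpy, scr), BlackPixel(dpy, scr));
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);
    GC gc = XCreateGC(dpy, win, 0, NULL);

    srand((unsigned)time(NULL));
    XEvent ev;
    for (;;) {
        XNextEvent(dpy, &ev);
        if (ev.type == Expose) {
            // Scatter random coloured points across the window
            for (int i = 0; i < 10000; i++) {
                XSetForeground(dpy, gc, (unsigned long)(rand() & 0xFFFFFF));
                XDrawPoint(dpy, win, gc, rand() % 400, rand() % 400);
            }
            XFlush(dpy);
        } else if (ev.type == KeyPress) {
            break; // any key quits
        }
    }
    XFreeGC(dpy, gc);
    XCloseDisplay(dpy);
    return 0;
}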
If you want to be able to run everywhere (both on the console and under X), look at SDL or GGI. If you only care about X11, you can use GTK, Qt, or even Xlib. There are many, many options...
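For the portable route, here is a rough SDL sketch of the same random-pixel idea (this one assumes SDL2 and its development headers; link with -lSDL2). It runs under X11 or Wayland, and on a bare console via SDL's KMSDRM backend where that is available:

// Rough SDL2 sketch: draw batches of random pixels until a key is pressed.
#include <SDL2/SDL.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        SDL_Log("SDL_Init failed: %s", SDL_GetError());
        return 1;
    }
    SDL_Window *win = SDL_CreateWindow("random pixels",
                                       SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                       640, 480, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, 0);

    srand((unsigned)time(NULL));
    int running = 1;
    while (running) {
        SDL_Event e;
        while (SDL_PollEvent(&e))
            if (e.type == SDL_QUIT || e.type == SDL_KEYDOWN)
                running = 0;

        // Plot a batch of random pixels in random colours each frame
        for (int i = 0; i < 5000; i++) {
            SDL_SetRenderDrawColor(ren, rand() % 256, rand() % 256, rand() % 256, 255);
            SDL_RenderDrawPoint(ren, rand() % 640, rand() % 480);
        }
        SDL_RenderPresent(ren);
        SDL_Delay(16); // roughly 60 fps
    }

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}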