Tremulous Forum
General => General Discussion => Topic started by: tehOen on July 15, 2007, 01:49:29 am
-
current svn uses 640x480 for the ui
this makes the ui look crappy. for example, if you are playing at 1024x768, all the menus are scaled by 1024.0/640.0 and converted to int.
this means a loss of precision (for the images too),
i.e. it ruins the ui if you are using a resolution higher than 640x480.
-
o'rly?
-
o'rly?
ya rly
I've got 2 monitors:
19" 1440x900
22" 1680x1050
I plan on buying a 24" and a 30", so upping the default resolution would be great.
-
Why not have it use variables to work with all resolutions, since many people use r_mode -1, r_customwidth xxxx and r_customheight xxxx?
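For reference, forcing a custom resolution from the console or an autoexec.cfg usually looks roughly like this; the 1680x1050 values are just an example, not a recommendation:
// illustrative autoexec.cfg snippet, not part of the Tremulous defaults
seta r_mode -1            // -1 means: use the custom width/height below
seta r_customwidth 1680   // desired horizontal resolution in pixels
seta r_customheight 1050  // desired vertical resolution in pixels
vid_restart               // restart the renderer so the mode change takes effect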
-
640x480 + low everything + r_picmip 7 = über 1337n355
-
640x480 + low everything + r_picmip 7 = über 1337n355
+Eleventyone
17 15 73[-] 12350||_|710/\/ [=012 1337 [o30[o|35 !1
-
640x480 + low everything + r_picmip 7 = über 1337n355
+Eleventyone
17 15 73[-] 12350||_|710/\/ [=012 1337 [o30[o|35 !1
I'm sorry, WHAT?
-
I kind of lose it after "It is teh". I'm obviously not leet enough. :(
-
my ui looks just fine whatever res i use. why not provide some images to prove your point?
-
start doing maths :roll:
I tested it; it looks better if you use a higher resolution for the menu definitions
-
<- 1280x800. Voted "non 4:3"
-
1280x1050
non 4:3
-
current svn uses 640x480 for the ui
this makes the ui look crappy. for example, if you are playing at 1024x768, all the menus are scaled by 1024.0/640.0 and converted to int.
this means a loss of precision (for the images too),
i.e. it ruins the ui if you are using a resolution higher than 640x480.
Should I bother to quote the UI code just to prove you wrong or will everybody believe me that you have no idea how the UI code works and you just make stupid posts?
A little hint: OpenGL doesn't care about current resolution.
-
current svn uses 640x480 for the ui
this makes the ui look crappy. for example, if you are playing at 1024x768, all the menus are scaled by 1024.0/640.0 and converted to int.
this means a loss of precision (for the images too),
i.e. it ruins the ui if you are using a resolution higher than 640x480.
Should I bother to quote the UI code just to prove you wrong or will everybody believe me that you have no idea how the UI code works and you just make stupid posts?
A little hint: OpenGL doesn't care about current resolution.
prove it then. you map 640x480 points to 1024x768 points; this mapping is not onto, thus you get holes
-
1280 x 854
-
854? wtf? lol
-
854? wtf? lol
Apparently (http://www.google.com/search?q=1280+854), it does exist. Who knew?
-
I know it exists, but I thought he played on a MacBook. (1280x800)
-
I know it exists, but I thought he played on a MacBook. (1280x800)
NO!!!
I play on a nearly 2 year old powerbook. (Read: Before Apple switched from IBM to Intel)
-
current svn uses 640x480 for the ui
this makes the ui look crappy. for example, if you are playing at 1024x768, all the menus are scaled by 1024.0/640.0 and converted to int.
this means a loss of precision (for the images too),
i.e. it ruins the ui if you are using a resolution higher than 640x480.
Should I bother to quote the UI code just to prove you wrong or will everybody believe me that you have no idea how the UI code works and you just make stupid posts?
A little hint: OpenGL doesn't care about current resolution.
prove it then. you map 640x480 points to 1024x768 points; this mapping is not onto, thus you get holes
OK.
First some theory about floating point numbers. The IEEE 754 standard 32-bit float guarantees a precision of 6 valid decimal digits. That means that vector coordinates scaled from 640x480 to 1024x768 have at least two valid digits after the decimal point. That's a precision of 1/100 of a pixel.
And now some code:
void UI_AdjustFrom640( float *x, float *y, float *w, float *h ) {
  // expect valid pointers
  *x *= uiInfo.uiDC.xscale;
  *y *= uiInfo.uiDC.yscale;
  *w *= uiInfo.uiDC.xscale;
  *h *= uiInfo.uiDC.yscale;
}
This function takes care of rescaling UI primitives. As you can see, x, y, w and h are all pointers to floats which get rescaled in place. xscale and yscale are also floats set to the right scaling values, e.g. xscale = 1024/640.0; yscale = 768/480.0. No precision is lost here.
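As a quick sanity check, here is a tiny standalone program (not engine code; the sample rectangle values are made up) that applies the same scaling:
#include <stdio.h>

int main( void ) {
  /* the same scale factors UI_AdjustFrom640 would use at 1024x768 */
  float xscale = 1024 / 640.0f;
  float yscale = 768 / 480.0f;

  /* an arbitrary rectangle in the virtual 640x480 space */
  float x = 123.0f, y = 45.0f, w = 200.0f, h = 30.0f;

  x *= xscale;  y *= yscale;
  w *= xscale;  h *= yscale;

  /* prints 196.800003 72.000000 320.000000 48.000000, i.e. off by far
     less than 1/100 of a pixel from the exact values */
  printf( "%f %f %f %f\n", x, y, w, h );
  return 0;
}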
All calls of this function are immediately followed by a syscall to the rendering functions. Again, all syscall arguments are passed as floats. You can check this in src/ui/ui_syscalls.c.
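For context, the call pattern on the UI side looks roughly like this; a simplified sketch of UI_DrawHandlePic with the texture-flipping cases left out, not a verbatim quote:
/* simplified sketch: the real UI_DrawHandlePic also handles flipped
   texture coordinates, but the float-only data path is identical */
void UI_DrawHandlePic( float x, float y, float w, float h, qhandle_t hShader ) {
  UI_AdjustFrom640( &x, &y, &w, &h );             // scale virtual 640x480 coords
  trap_R_DrawStretchPic( x, y, w, h, 0, 0, 1, 1,  // everything stays float
                         hShader );
}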
Syscall handlers then take the arguments, still floats, and generate a rendering command. Again, the rendering command coordinates are floats (x, y, w, h):
typedef struct {
  int commandId;
  shader_t *shader;
  float x, y;
  float w, h;
  float s1, t1;
  float s2, t2;
} stretchPicCommand_t;
When the renderer backend decides it's time to render a new frame, it copies all rendering commands to the tesselator. The x, y, w and h coordinates are copied into the tess.xyz[][] array, which is again an array of floats.
typedef float vec_t;
typedef vec_t vec4_t[4];

typedef struct shaderCommands_s
{
  glIndex_t indexes[SHADER_MAX_INDEXES] ALIGN(16);
  vec4_t xyz[SHADER_MAX_VERTEXES] ALIGN(16);
  vec4_t normal[SHADER_MAX_VERTEXES] ALIGN(16);
  vec2_t texCoords[SHADER_MAX_VERTEXES][2] ALIGN(16);
  color4ub_t vertexColors[SHADER_MAX_VERTEXES] ALIGN(16);
  ...
} shaderCommands_t;

extern shaderCommands_t tess;
Again, no precision loss.
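If you want to see it, the quad setup in the backend goes roughly like this; a trimmed sketch of RB_StretchPic with the texcoord and colour writes cut out, not a verbatim quote:
/* trimmed sketch of the quad setup in the backend (RB_StretchPic in
   tr_backend.c); texcoord, colour and index writes are left out here */
int numVerts = tess.numVertexes;

tess.xyz[ numVerts + 0 ][0] = cmd->x;
tess.xyz[ numVerts + 0 ][1] = cmd->y;

tess.xyz[ numVerts + 1 ][0] = cmd->x + cmd->w;
tess.xyz[ numVerts + 1 ][1] = cmd->y;

tess.xyz[ numVerts + 2 ][0] = cmd->x + cmd->w;
tess.xyz[ numVerts + 2 ][1] = cmd->y + cmd->h;

tess.xyz[ numVerts + 3 ][0] = cmd->x;
tess.xyz[ numVerts + 3 ][1] = cmd->y + cmd->h;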
The last step in the Q3 renderer core is dumping all the commands in the tesselator right into the GPU. Again, the OpenGL functions used for that take float arguments. The Q3 engine doesn't touch textures during rendering; they all sit unchanged in memory since they were loaded. All texturing is done on the GPU, the Q3 renderer just says which textures to use.
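Roughly speaking, that final hand-off is just a vertex array of those same floats, something like this (paraphrasing tr_shade.c, not a verbatim quote):
/* rough paraphrase of the final submission in tr_shade.c: the same float
   array is handed to OpenGL (stride 16 because xyz is a vec4_t) */
qglVertexPointer( 3, GL_FLOAT, 16, tess.xyz );
R_DrawElements( tess.numIndexes, tess.indexes );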
So where's your int conversion precision loss?
Resizing UI from virtual 640x480 to any other resolution would take weeks and all you would get are bigger numbers in menu definition files. The result would look exactly the same.
-
So where's your int conversion precision loss?
Resizing UI from virtual 640x480 to any other resolution would take weeks and all you would get are bigger numbers in menu definition files. The result would look exactly the same.
however it does not. and stop pasting stuff, I already read all the ui code and the related renderer code. practice shows that when you scale all the menu files to 1024x768 and the virtual ui space to the same resolution, it looks better than the one scaled up from 640x480.
and you know that every single pixel in 3d space must be mapped to your screen, i.e. 2d space, but your screen does not use floats, it uses ints. here is your precision loss. there is also some loss of precision in menu positioning. no need to mention hud stretching on non 4:3 resolutions.
-
however it does not. and stop pasting stuff, I already read all the ui code and the related renderer code. practice shows that when you scale all the menu files to 1024x768 and the virtual ui space to the same resolution, it looks better than the one scaled up from 640x480.
gareth asked you to post screenshots. I'm waiting for results of your "practice" as well.
and you know that every single pixel in 3d space must be mapped to your screen, i.e. 2d space, but your screen does not use floats, it uses ints. here is your precision loss. there is also some loss of precision in menu positioning. no need to mention hud stretching on non 4:3 resolutions.
So? One pixel is so small it doesn't matter. There's widescreen bias to fix non 4:3 resolutions. Are you going to rescale all menus to all possible resolutions or will you "fix" one resolution by "breaking" the other? There's no point in fixing what's not broken in the first place.
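To put a number on the "one pixel" point: the rasterizer snaps the same float coordinate to the same pixel no matter which virtual space the menu was authored in, so the worst-case placement error is half a pixel either way. A toy illustration, not engine code, with made-up coordinate values:
#include <math.h>
#include <stdio.h>

int main( void ) {
  /* the same screen position expressed via two virtual coordinate spaces */
  float from640  = 123.0f * ( 1024 / 640.0f );  /* element authored at x=123 in 640x480 space  */
  float from1024 = 197.0f;                      /* same element authored directly for 1024x768 */

  /* both end up on the same pixel column; any difference is well under one pixel */
  printf( "%d %d\n", (int)roundf( from640 ), (int)roundf( from1024 ) );
  return 0;
}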
-
http://img165.imageshack.us/img165/2264/compqh9.png
left one: a screenshot of the main menu defined at 640x480, taken at 1024x768
right one: a screenshot of the main menu defined at 1024x768, taken at 1024x768
both scaled down to show the dramatic difference
-
Scaled-down pictures don't prove anything because all the detail is lost, especially when you use crappy software to scale them down. Post them in full detail.
And I think you're lying about the resolutions of those pictures, because the polygon dretch model would be the same if the resolutions were the same. I guess the right screenshot was taken at 640x480, because the dretch has visibly sharper edges than the left one. Scaling down from a higher resolution makes edges softer.
-
if you don't believe me, try it yourself. I won't bother with you anymore
-
Seriously, 640x480 is a good default resolution, one you can be pretty certain will work on just about any computer.
But really, without sounding too harsh;
Who gives a fuck, really?
-
You can't really have an accurate poll if some of the respondents fall into two categories. "Higher than 1280x960" or "non 4:3," which one shall I choose?
-
Seriously, 640x480 is a good default resolution, one you can be pretty certain will work on just about any computer.
But really, without sounding too harsh;
Who gives a fuck, really?
that actually isn't a resolution, it's a coordinate system. opengl doesn't care if you give it a coordinate system of 4x3, 640x480 or 1600x1200; the resulting output is the same.
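a generic illustration of that point (not a quote from the tremulous renderer, which scales coordinates in UI_AdjustFrom640 instead, as shown above): an orthographic projection maps whatever 2d coordinate system you pick onto the full viewport, regardless of the window's pixel resolution. windowWidth/windowHeight below are placeholder names.
/* generic fixed-function OpenGL 2D setup, not Tremulous code: the virtual
   space chosen here is stretched over the whole viewport by the GPU */
glViewport( 0, 0, windowWidth, windowHeight );  /* actual pixel resolution */
glMatrixMode( GL_PROJECTION );
glLoadIdentity();
glOrtho( 0, 640, 480, 0, -1, 1 );               /* virtual 640x480, y down */
/* a quad drawn from (0,0) to (640,480) now fills the screen at any resolution */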
-
But I'm too stupid to understand that. I play Trem at a 640x480 resolution, please let me believe this.
-
"]You can't really have an accurate poll if some of the respondents fall into two categories. "Higher than 1280x960" or "non 4:3," which one shall I choose?
I'd vote for non 4:3 and assume all other options imply 4:3 resolution even if they don't say so explicitly.