The original title was "On Gamma handling in Unreal.", but I have changed it to represent the topic more accurately.
For a bit of background information you might want to read this article first:
Unreal content and rendering were initially designed *not* to perform any gamma correction on output; instead, the gamma correction was pre-applied to the texture resources themselves. The table below gives a rough estimate of the gamma values used on the textures.
Editor Actor Sprites: 2.2
Most likely they decided to render this incorrect way, at the cost of poor lighting, to avoid the heavy artifacts which would otherwise have been introduced by the limitations of the time (Thief 1/2 is a good example of the "correct" way from around that time, and it did introduce heavy artifacts). Everything in rendering was built around, or built to counter, issues with the gamma-incorrect rendering. In the case of mesh skins, they also used some heavy contrast enhancement (sigmoidal is my current educated guess).
Probably the best RenDev and version for judging how Unreal initially looked is Software Rendering in version 200. It shows some very well-balanced lighting.
I made some tests to mimic the rendering behaviour of the version 200 SoftDrv; if you have Unreal 226b installed and want to give it a try: http://coding.hanfling.de/NoGamma226b-20150328.zip (If there is some interest I can also make a 227i build of it, but it's mostly a demo for surface lighting, not intended for productive use).
Otherwise here are some screenshots:
Note that the more saturated look of GlideDrv was caused by a combination of a slight output gamma "correction" (1.25 by default) and probably the pyramid scaling, which was neither linear nor quadratic. It may not even have handled color saturation correctly or consistently at all.
So much for the nostalgic how-it-was part. The really complicated things start when one wants to turn the gamma-incorrect rendering into gamma-correct rendering to improve the odd lighting. Basically it involves three parts: using correct gamma correction on output (based on the monitor!), adjusting textures to contain, or be loaded as, linear data (gamma correction, removing the extreme contrast enhancements on mesh textures, etc.), and the lighting itself.
While the first part is straightforward, the second part about adjusting resources will be a lot of work and also includes developing a toolchain to ease converting existing resources (I currently have something in development for this task).
The really complicated part is the lighting. It basically splits into two issues: one needs to adjust the overall light levels, which probably means creating another drop-off function for the light data itself, but the bigger problem is keeping the color saturation levels, as these are heavily influenced by this adjustment.
So much for now.
Now a bit more detail about what I have in mind for batch-correcting texture resources:
The biggest issue in itself is rebuilding texture packages. Though one can do this by hand, by reimporting into the Editor, this method involves a lot of unnecessary work, is inflexible, and is prone to errors. I currently have a successor to the make commandlet in development, which also includes a feature to auto-include resources. Let's say you place a texture file under YourPackageSrc/Textures: it will get automatically imported on package rebuild, in the same way as UnrealScript source under YourPackageSrc/Classes would be. If you put the texture in a subdirectory like YourPackageSrc/Textures/MyGroup instead, it will automatically be imported into that group. For applying custom options like DrawScale, Flags, etc., I reuse the *.upkg file, previously used only to apply package flags, where you can now set options on a per-package, per-group, and per-resource level. In fact, it already supports more options than the #exec lines for texture import in UnrealScript do.
The *.upkg file for the DeusEx CoreTexDetail package basically looks like this:
Code: Select all
[Package]
Directory=Textures
Extension=utx

; Enable Mips and set DrawScale=0.25 as default for the Detail group.
[Detail]
TextureMips=True
TextureDrawScale=0.25

; Override DScanline's DrawScale, so it uses its old 0.5 DrawScale instead of the group default of 0.25.
[Detail.DScanline]
TextureDrawScale=0.5
Code: Select all
[Detail.DCracks_A]
TextureDrawScale=0.250000
[Detail.DFabric_A]
TextureDrawScale=0.250000
[Detail.DFabric_B]
TextureDrawScale=0.250000
[Detail.DGouges_A]
TextureDrawScale=0.250000
[Detail.DMetal_A]
TextureDrawScale=0.250000
[Detail.DMetal_B]
TextureDrawScale=0.250000
[Detail.DPitted_A]
TextureDrawScale=0.250000
[Detail.DPitted_B]
TextureDrawScale=0.250000
[Detail.DPitted_C]
TextureDrawScale=0.250000
[Detail.DScanline]
TextureDrawScale=0.500000
[Detail.DScratchs_A]
TextureDrawScale=0.250000
[Detail.DStone_A]
TextureDrawScale=0.250000
[Detail.DStone_B]
TextureDrawScale=0.250000
[Detail.DStone_C]
TextureDrawScale=0.250000
[Detail.DStone_D]
TextureDrawScale=0.250000
[Detail.DWoodFine_A]
TextureDrawScale=0.250000
[Detail.DWoodRuff_A]
TextureDrawScale=0.250000
For integrating color adjustment operations (like gamma) I intend to add support for further options such as TextureGamma, so for batch-converting a single package with a single gamma one would basically use a *.upkg header like:
Code: Select all
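A minimal sketch of such a header, assuming TextureGamma is accepted at the [Package] level like the other options (the value 2.2 is only an illustrative guess, not a confirmed option name or value):

```ini
[Package]
Directory=Textures
Extension=utx
; Hypothetical option: linearize all textures in this package
; by removing a pre-applied gamma of 2.2 on import.
TextureGamma=2.2
```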
I'm really excited about doing this for the stock Unreal and Nerf textures myself, but I really do want to do it on a high-quality monitor, for best quality, and to avoid baking the deficits of my monitor into the texture packages themselves. I'm especially excited about figuring out which contrast enhancement they used for the mesh skins, so I can implement it myself for the build commandlet (my educated guess is still that it's a sigmoidal contrast enhancement).
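For reference, a sigmoidal contrast enhancement of the kind guessed at here can be sketched as a normalized logistic S-curve (ImageMagick-style); the strength and midpoint parameters are hypothetical, not values known to have been used on the mesh skins:

```python
import math

def sigmoidal_contrast(u, strength=5.0, midpoint=0.5):
    """S-curve contrast enhancement of a value u in [0, 1].

    A logistic sigmoid centered on `midpoint`, rescaled so that
    0 maps to 0 and 1 maps to 1; values below the midpoint get
    darker, values above it get brighter.
    """
    def s(x):
        return 1.0 / (1.0 + math.exp(strength * (midpoint - x)))
    return (s(u) - s(0.0)) / (s(1.0) - s(0.0))
```

With the symmetric default midpoint of 0.5, the midpoint itself stays fixed while shadows are pushed down and highlights pushed up.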
As a for now final note:
This page has a nice example of how wrong gamma handling results in poor lighting in the scene, and another example of how this change requires changes to the light attenuation itself.
I came up with a light attenuation function for linear rendering which resembles the overall behaviour, and especially the light levels, of the old weird distance attenuation code, without the dimming issue when objects are too close to light sources.
The basic form is:
Code: Select all
Additionally, these parameters need to satisfy the following relation, as otherwise the smoothstep is undefined:
Code: Select all
a*(LightRadius+b) < u*(LightRadius+v)
I plotted some comparisons of the effect at a dead-on angle on a white texture, while approximating the effect of the "old" ~sRGB rendering with a gamma of 2.2:
One motivation for preferring the smoothstep approach over a "physically more correct" 1/r^2 attenuation for point light sources is to think of LightRadius as also being a parameter controlling the physical extent of the light source itself. If you are "close" to an extended light source, the light source will appear flat and infinitely extended, and thus the light intensity stays approximately constant. This is basically the same geometric consideration you might have heard in school for a horseshoe magnet or a parallel-plate capacitor.
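The argument above can be illustrated numerically: near the source, a smoothstep-style attenuation stays bounded while 1/r^2 explodes. A minimal sketch (function names are mine, not the engine's):

```python
def smoothstep(edge0, edge1, x):
    # Clamped cubic Hermite interpolation: 3t^2 - 2t^3.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def smoothstep_attenuation(distance, light_radius):
    # Fades smoothly from 1 at the source to 0 at light_radius.
    return smoothstep(0.0, 1.0, 1.0 - distance / light_radius)

def inverse_square_attenuation(distance):
    # "Physically more correct" point-light falloff; unbounded near the source.
    return 1.0 / (distance * distance)
```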
In other news, I made some progress on the toolchain front too. I wrote a UExporter for exporting bitmap fonts with correctly rebuilt borders around the glyphs, and a UExporter and a UFactory for exporting UPalettes to ACT (Adobe Color Table) files. Sadly ACT is just 24 bit, but this should be sufficient nonetheless.
The idea behind the UPalette UFactory/UExporter is that on the one hand I cannot use P8 textures for storing sufficient-quality mipmaps, but on the other hand they are still needed for the palette-based light effects. This is also a step towards exporting/reimporting fractal textures, where for most of them the palette is crucial. My plan for exporting/reimporting fractal textures is basically to use some sort of ini file to store the properties of the fractal texture and its sparks.
I conclude for now with some screenshots showing the effect of scaling the light based on the Brightness setting, rather than messing with the gamma ramp to control brightness. The screenshots were done using linear rendering, but the results are similar to those one gets with the old non-linear rendering.
Currently I can take sRGB / Linear (sRGB) / AppleRGB / AdobeRGB data, optionally convert it to Linear (sRGB) for mipmap filtering, and output it as either Linear (sRGB) or sRGB (i.e. what hardware natively supports), written out as RGBA.
Mipmap generation currently supports only a simple box filter, but it is done in floating point.
It certainly helps not to have the palettized mipmaps anymore, and filtering the mipmaps in linear space noticeably reduces the odd darkening-with-distance effect so prominent in Deus Ex.
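The difference between filtering in gamma space and in linearized sRGB shows up even in a toy 2x2 box-filter step (a sketch of the idea, not the actual toolchain code):

```python
def srgb_to_linear(c):
    # 8-bit sRGB value -> linear light in [0, 1] (IEC 61966-2-1).
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Linear light in [0, 1] -> 8-bit sRGB value.
    c = 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1.0 / 2.4) - 0.055
    return round(c * 255.0)

def box_downsample(four_texels, in_linear=True):
    """Average a 2x2 block of 8-bit sRGB texels.

    in_linear=True filters in linearized sRGB (floating point);
    in_linear=False averages naively in gamma space, which makes
    a black/white checker come out much too dark.
    """
    if in_linear:
        avg = sum(srgb_to_linear(t) for t in four_texels) / 4.0
        return linear_to_srgb(avg)
    return round(sum(four_texels) / 4.0)
```

A black/white checker averaged in gamma space yields mid-gray 128, while the linear-space average comes out around 188, which is the perceptually correct 50% light level.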
The nice thing about this initial (small) implementation is that I can easily expand it. Among the most interesting options are supporting mipmap generation in CIELAB/CIELUV space, an option for biasing alpha-channel mipmap generation for masked textures, supporting non-box filters, etc.
Another thing I should probably start building in soon is support for texture compression. There is a header for S3TC compression in the PubSrc's, and the lib for it is found inside the Deus Ex SDK, which would be a quick start. Probably not the best solution, but it would be the same compressor as used before when importing 24-bit textures, and I would rather quickly make up for the currently missing 24 bit -> S3TC import step when using the build commandlet (though one could still use the old-style #exec lines to get it).
In any case this would be a good start towards support for RGTC and BPTC, which is certainly high on my priority list, as the old S3TC compressed formats are more than troublesome for storing normals, etc.
However, what is sort of odd is that the built-in BMP/PCX factories/exporters treat TEXF_RGBA8 as BGRA data, and I'm not sure whether I will write replacements or wrappers around those which handle conversion to RGBA data, or really completely treat TEXF_RGBA8 as BGRA data. I tend towards making it 'real' RGBA data, as this would in the end be less confusing, and I wouldn't need to pull this BGRA handling through all the custom code I still want to write to support additional formats.
I also read an article, though about PBR, stating that artists tend to make diffuse textures too dark, which causes issues for shading. Textures in Deus Ex are really dark, in Unreal rather dark, while in Nerf they are at a rather sane brightness level; and linear rendering turned out to work best for Nerf, to some degree for Unreal, and not at all for Deus Ex. So I gave it a shot and brightened some of the textures in Deus Ex, and the rendering certainly did improve noticeably. It looks like I found another reason why things in Deus Ex didn't work out so far.
Though I probably posted this before, the motivation is based on this [url=http://staff.fh-hagenberg.at/burger/publications/pdf/aapr2010.pdf]paper[/url] by Burger, pointing out that performing linear filtering in either sRGB or linearized sRGB is actually not a good choice, and suggesting CIELAB or CIELUV instead.
a) Mips generated in sRGB space, filtered in sRGB space ingame.
b) Mips generated in linear (sRGB) space, filtered in linear (sRGB) space ingame.
c) Mips generated in CIELAB, filtered in linear (sRGB) space ingame.
d) Mips generated in CIELUV, filtered in linear (sRGB) space ingame.
I haven't done any extensive testing yet, but I wanted to give a road with a white stripe a try, as this usually ends up with some bloating of the white stripe once anisotropic filtering gives up (I guess my driver settings enforced 8x anisotropic filtering for this shot).
The differences are certainly minor, but imho the white bloating is slightly reduced when generating the mipmaps in CIELAB or CIELUV space. So I'm curious to try storing the texture data in CIELAB, letting the GPU filter it linearly in CIELAB space, and afterwards converting it to (linearized) sRGB inside the fragment shader, to see whether this further reduces the effect *and* hopefully provides more pleasant texture magnification.
One further note on a), which doesn't seem far off from b): especially when the diffuse textures are rather dark, as in Deus Ex, a) causes a clearly noticeable and odd darkening-with-distance effect, while b) works fine.
However, CIELUV is a bitch. Take a look at the CIE XYZ <-> CIELUV transformations on Bruce Lindbloom's really helpful site:
For XYZ -> LUV: in case (X+15Y+3Z)==0 you get divide-by-zero errors for u' and v'. My current workaround is to set LUV=(0,0,0) in case (X+15Y+3Z) <= 0. For LUV -> XYZ: in case (u+13*L*u0)==0 and/or (v+13*L*v0)==0 you get a divide by zero for a and/or d respectively. My (not bulletproof) current solution is to check whether L <= 0 and return XYZ=(0,0,0) in that case.
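A sketch of the XYZ -> LUV direction with the divide-by-zero guard described above, using Lindbloom's formulas with a D65 white point (the fallback to (0,0,0) mirrors the workaround, not any official spec behaviour):

```python
# CIE constants
EPS = (6.0 / 29.0) ** 3        # ~0.008856
KAPPA = (29.0 / 3.0) ** 3      # ~903.3

# D65 reference white
XN, YN, ZN = 0.95047, 1.0, 1.08883

def _uv_prime(X, Y, Z):
    # u', v' chromaticity; None signals the degenerate denominator.
    denom = X + 15.0 * Y + 3.0 * Z
    if denom <= 0.0:
        return None
    return 4.0 * X / denom, 9.0 * Y / denom

def xyz_to_luv(X, Y, Z):
    uv = _uv_prime(X, Y, Z)
    if uv is None:
        return 0.0, 0.0, 0.0   # workaround: degenerate input maps to LUV=(0,0,0)
    up, vp = uv
    upn, vpn = _uv_prime(XN, YN, ZN)
    yr = Y / YN
    L = 116.0 * yr ** (1.0 / 3.0) - 16.0 if yr > EPS else KAPPA * yr
    return L, 13.0 * L * (up - upn), 13.0 * L * (vp - vpn)
```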
I'll go back over my other textures and re-export them with the colour spaces you've mentioned both in this thread and IRC.
Conversion to Apple RGB is done absolute colorimetric; in case of ROMM I used Photoshop with (afaik) the same setting. However, ROMM uses the D50 illuminant instead of the D65 the other two use, so there is plenty of room for experimenting with other whitepoint adaptation schemes.
Basically it's a quadratic falloff, enveloped so it does not explode at the origin and fades to zero once it hits the LightRadius limit.
One certainly favorable property is that its brightness (how much light it gives off) is basically controlled by the brightness of the light source and not by the LightRadius property, unless you make that too small. So LightRadius ends up being just for culling, i.e. trimming down the number of light sources affecting an area.
However, first of all I needed to be able to put in really bright light sources in the first place, so I settled on adding another LightExponent variable to Actor, which is used to calculate an additional scaling factor. As the default value I picked 127.
Calculation is just a simple: [code]Factor = 2^((LightExponent - (LightEffect==LE_Standard ? 0 : 127))/8)[/code]
And yes, LE_Standard starts out at about 60k times the intensity.
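The scaling can be checked quickly; this is a direct transcription of the one-liner above (the function name is mine):

```python
def light_scale_factor(light_exponent, is_le_standard):
    """Factor = 2^((LightExponent - bias)/8).

    The bias of 127 makes the default exponent yield a factor of
    exactly 1.0 for non-standard light effects, while LE_Standard
    keeps a bias of 0.
    """
    bias = 0 if is_le_standard else 127
    return 2.0 ** ((light_exponent - bias) / 8.0)
```

With the default LightExponent of 127, a non-LE_Standard light gets factor 1.0, while LE_Standard gets 2^(127/8), roughly 60,000, which matches the "about 60k" figure.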
Basically it works like this in the end (using another Factor calculation, which yielded too low values):
a) The max of 255 in the screenshot just gave a factor of 256; now it's 64k.
b) Just clamping color values component-wise on saturation sucks.
c) Having no specular sucks, as otherwise this would bloom more towards white.
d) Unlit sucks.
So first try in some box:
That's where I noticed that something was wrong, as it was hardly visible at all with LightExponent=255 and LightBrightness=255, and I needed to adjust my Factor calculation for the LightExponent.
Another shot, this time with textures, in Nerf, where I had deleted all other light sources beforehand:
Afterwards I went ahead and started some testing in the first Deus Ex map by replacing the large lamp posts with this kind of light source.
Single one only:
At the back of the statue (upper segments of the statue still using the 'old' light sources):
I certainly do like the last shot, and I'm positively surprised at how plausible the lighting is that this attenuation function gives for just Lambertian diffuse. In the future I'll probably experiment a bit with exposing the constant in the denominator to be configurable by the user, as stated in the pdf. I was also able to get more plausible lighting with just 1/4 of the original number of light sources, which is certainly a positive side effect, considering that at some point I do want to do realtime shadows for some light sources.
In any case, while doing this and trying to add a bit of ZoneAmbient, it really felt like running into the limitations of the Deus Ex textures; that is, their brightness is (unrealistically) different, and things like the bricks on the wall textures have way too much shadow baked in. :/
Good job, keep up the good work Han.
It is worth making sure you installed the monitor with the inf and ICC files from the manufacturer.
If you don't have the install files, or want to examine or customise/override your current monitor config and colour system, these free tools will help.
These tools may also unlock hidden speeds and resolutions outside the PnP standards, that would be listed in a missing inf file.
Very cool thread 8-)
https://www.oldunreal.com/cgi-bin/yabb2 ... 1578440909
Basically, using the cwdohnal OGLR I can't seem to adjust the gamma or brightness at all.
Code: Select all
FLOAT LOverR = DistanceToLightSource/WorldLightRadius;
FLOAT Attenuation = 2.0f*(LOverR*LOverR*LOverR)-3.0f*(LOverR*LOverR)+1.0f;
Code: Select all
FLOAT Attenuation = FSmoothstep( 0.0f, 1.0f, 1.0f-DistanceToLightSource/WorldLightRadius );
The dotted line is the pure cubic Hermite interpolation attenuation; the solid line is the legacy surface attenuation function. Looks like a pretty good fix to me. I still have no idea why I didn't do it this way the last time...
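The two snippets are in fact the same curve on [0, LightRadius]: substituting x = 1 - d/r into smoothstep's 3x^2 - 2x^3 expands to 2t^3 - 3t^2 + 1 with t = d/r. A quick numerical check (helper names are mine):

```python
def attenuation_expanded(distance, radius):
    # First snippet: expanded cubic polynomial form.
    t = distance / radius
    return 2.0 * t**3 - 3.0 * t**2 + 1.0

def attenuation_smoothstep(distance, radius):
    # Second snippet: FSmoothstep(0, 1, 1 - d/r), i.e. 3x^2 - 2x^3 clamped.
    x = max(0.0, min(1.0, 1.0 - distance / radius))
    return x * x * (3.0 - 2.0 * x)
```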