On Gamma handling in Unreal. (Read 2465 times)
han
Global Moderator
Unreal Rendering Guru
Developer Team
*****
Offline


Oldunreal member

Posts: 516
Location: Germany
Joined: Dec 10th, 2014
Gender: Male
On Gamma handling in Unreal.
Mar 28th, 2016 at 8:56am
For a bit of background information you might want to read this article first:
http://gamedevelopment.tutsplus.com/articles/gamma-correction-and-why-it-matters...

Unreal's content and rendering were initially designed *not* to perform any gamma correction on output; instead, the gamma correction was pre-applied to the texture resources themselves. The table below shows a rough estimate of the gamma values used on the textures.
Surfaces 1.6-1.9
Meshes 2.0
Tiles/Sprites 1.65
Editor Actor Sprites 2.2
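As a quick sketch of what "pre-applied gamma" means in practice (illustrative Python, my own function names, not engine code): a linear intensity is stored already gamma-encoded, so sending it to a ~2.2 display without any output correction roughly cancels the encoding again.

```python
# Illustrative sketch (mine, not engine code) of "pre-applied gamma":
# a linear intensity is stored already gamma-encoded, so a ~2.2 display
# with *no* output correction roughly cancels the encoding again.

def encode_texel(linear, texture_gamma):
    """Bake gamma into the stored 8-bit value, as the original assets did."""
    return round((linear ** (1.0 / texture_gamma)) * 255)

def displayed_intensity(stored, display_gamma=2.2):
    """Raw output with no correction: the monitor applies its own gamma."""
    return (stored / 255.0) ** display_gamma

# A mid-grey baked with gamma 2.2 round-trips almost exactly on a 2.2 display:
print(displayed_intensity(encode_texel(0.5, 2.2)))  # ~0.5
```

With the lower gammas in the table above (1.6-2.0), the round-trip undershoots, which is part of why the stock output looks the way it does on modern displays.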

Most likely they decided to render in this incorrect way, at the expense of poorer lighting, to avoid the heavy artifacts which would otherwise have been introduced by hardware limitations of the time (Thief 1/2, which went the "correct" way around that time, is a good example of those artifacts). Everything in the renderer was built around, and/or built to counter, issues with the gamma-incorrect rendering. In the case of mesh skins, they also used some heavy contrast enhancement (sigmoidal is my current educated guess).

Probably the best render device and version for judging how Unreal initially looked is software rendering in version 200. It shows some very well-balanced lighting.
I made some tests to mimic the rendering behaviour of the version 200 SoftDrv; if you have Unreal 226b installed and want to give it a try: http://coding.hanfling.de/NoGamma226b-20150328.zip (If there is some interest I can make a 227i build of it as well, but it's mostly a demo for surface lighting, not intended for production use).

Otherwise here are some screenshots:








Note that the more saturated look of GlideDrv was caused by a combination of a slight output gamma "correction" (1.25 by default) and probably the pyramid scaling, which was neither linear nor quadratic. It may not even have handled color saturation correctly or consistently at all.

So much for the nostalgic how-it-was part. The really complicated part starts when one wants to turn the gamma-incorrect rendering into gamma-correct rendering to improve the odd lighting. Basically it involves three parts: using correct gamma correction on output (based on the monitor!), adjusting textures to contain, or be loaded as, linear data (gamma correction, removing the extreme contrast enhancements on mesh textures, etc.), and the lighting itself.
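A minimal sketch of that first part (all function names here are mine): keep the shading math linear and apply a single monitor-based gamma encode at output.

```python
# Sketch of the "straightforward" first part (function names are mine):
# do the lighting math in linear space and gamma-encode exactly once at
# output, with display_gamma standing in for a per-monitor calibration.

def shade_linear(albedo, light):
    """Lighting math belongs in linear space."""
    return min(albedo * light, 1.0)

def to_display(linear, display_gamma=2.2):
    """The single output gamma 'correction', based on the monitor."""
    return linear ** (1.0 / display_gamma)

print(to_display(shade_linear(0.5, 1.0)))  # ~0.73 on a 2.2 display
```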

While the first part is straightforward, the second part, adjusting resources, will be a lot of work and also includes developing a toolchain to ease converting existing resources (I currently have something in development for this task).

The really complicated part is the lighting. It basically splits into two issues. One needs to adjust the overall light levels, which probably means creating another falloff function for the light data itself; but the bigger problem is keeping the color saturation levels, as these are heavily influenced by this adjustment.

So much for now.
  

HX on Mod DB. Revision on Steam.
han
Re: On Gamma handling in Unreal.
Reply #1 - Mar 29th, 2016 at 7:34am
Nerf Arena Blast (unlike other, later UE1 games) looks like it was also developed under the presumption that no output gamma would be applied.

Screenshots:










Now a bit more detail about what I have in mind for batch-correcting texture resources:
The biggest issue in itself is rebuilding texture packages. Though one can do this in the end by reimporting into the Editor by hand, that method involves a lot of unnecessary work, is inflexible, and is prone to errors. I currently have a successor to the make commandlet in development, which also includes a feature to auto-include resources. Let's say you place a texture file under YourPackageSrc/Textures: it will get automatically imported on package rebuild, the same way UnrealScript source under YourPackageSrc/Classes would be. If you put the texture in a subdirectory like YourPackageSrc/Textures/MyGroup instead, it will automatically be imported into that group. For applying custom options like DrawScale, flags, etc., I reuse the *.upkg file, previously used just to apply package flags; there you can set options on a per-package, per-group, or per-resource level. In fact it already supports more options than the #exec lines for texture import in UnrealScript do.

The *.upkg file for DeusEx CoreTexDetail package basically looks like this:
Code:
[Package]
Directory=Textures
Extension=utx

; Enable Mips and set DrawScale=0.25 as default for the Detail group.
[Detail]
TextureMips=True
TextureDrawScale=0.25

; Override DScanline's DrawScale, so it uses its old 0.5 DrawScale instead of the group default of 0.25.
[Detail.DScanline]
TextureDrawScale=0.5
 


The other crucial commandlet in the toolchain is the rebuildimports commandlet, which offers a -upkg option to autogenerate the *.upkg file of an existing package. Since it was initially designed for recreating #exec lines for packages with stripped source (BrotherBear *wink* *wink*), it currently supports no automatic grouping feature, so the autogenerated *.upkg file for this package -- while being functionally equivalent -- looks like this:
Code:
[Detail.DCracks_A]
TextureDrawScale=0.250000

[Detail.DFabric_A]
TextureDrawScale=0.250000

[Detail.DFabric_B]
TextureDrawScale=0.250000

[Detail.DGouges_A]
TextureDrawScale=0.250000

[Detail.DMetal_A]
TextureDrawScale=0.250000

[Detail.DMetal_B]
TextureDrawScale=0.250000

[Detail.DPitted_A]
TextureDrawScale=0.250000

[Detail.DPitted_B]
TextureDrawScale=0.250000

[Detail.DPitted_C]
TextureDrawScale=0.250000

[Detail.DScanline]
TextureDrawScale=0.500000

[Detail.DScratchs_A]
TextureDrawScale=0.250000

[Detail.DStone_A]
TextureDrawScale=0.250000

[Detail.DStone_B]
TextureDrawScale=0.250000

[Detail.DStone_C]
TextureDrawScale=0.250000

[Detail.DStone_D]
TextureDrawScale=0.250000

[Detail.DWoodFine_A]
TextureDrawScale=0.250000

[Detail.DWoodRuff_A]
TextureDrawScale=0.250000
 



The textures themselves can be extracted as usual using the batchexport commandlet.

For integrating color adjustment operations (like gamma) I intend to add support for further options such as TextureGamma, so for batch-converting a single package with a single gamma one would basically use a *.upkg header like:
Code:
[Package]
TextureGamma=1.6
 



For best quality, however (pushing quality above what is already in there), one should evaluate this on a per-texture basis.
I'm really excited about doing this for the stock Unreal and Nerf textures myself, but I want to do it on a high-quality monitor, so as not to bake the deficits of my monitor into the texture packages themselves. I'm especially excited about figuring out which contrast enhancement option they used for the mesh skins, so I can implement it myself for the build commandlet (my educated guess is still that it's a sigmoidal contrast enhancement).
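To illustrate, a hypothetical sketch (not the commandlet's actual code) of the per-texel work such a TextureGamma pass implies, assuming the option declares the gamma that was pre-applied to the source data:

```python
# Hypothetical sketch (not the commandlet's code) of what a
# TextureGamma=1.6 pass could do per texel: undo the pre-applied gamma
# so the stored 8-bit data becomes linear.

def linearize_texels(texels, texture_gamma=1.6):
    """Decode gamma-encoded bytes into linear 8-bit values."""
    return [round(((t / 255.0) ** texture_gamma) * 255) for t in texels]

print(linearize_texels([0, 128, 255]))  # [0, 85, 255]
```

Black and white are fixed points; everything in between shifts darker, which is exactly the data a linear pipeline then wants to consume.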

As a final note for now:
http://www.learnopengl.com/#!Advanced-Lighting/Gamma-Correction
This page has a nice example of how wrong gamma handling results in poor lighting in the scene, and further examples of how this change requires changes to the light attenuation itself.
  

han
Re: On Gamma handling in Unreal.
Reply #2 - Apr 4th, 2016 at 2:57pm
Looks like I need to revise my previous findings slightly. The resources themselves seem to be stored with a gamma of 2.2, but in a different RGB color space. My current assumption is that most surface textures (or at least the ones I checked) are stored in the Adobe Wide Gamut RGB color space, while most mesh textures appear to be stored in the Adobe RGB color space. However, this requires more detailed analysis, as other similar color spaces might be involved. Any information about the tools they used for creating the textures would be helpful for building a candidate list of color spaces.
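For illustration, here is roughly what re-interpreting a texel under one of these spaces involves. This is a sketch only: it uses plain Adobe RGB (not the Wide Gamut variant) with its ~2.2 tone curve, and the standard D65 matrices published on Bruce Lindbloom's site.

```python
# Sketch of what re-interpreting a texel's color space involves, using
# plain Adobe RGB (not the Wide Gamut variant) with its ~2.2 tone curve.
# Matrices are the standard D65 ones from Bruce Lindbloom's site.

ADOBE_TO_XYZ = [
    (0.5767309, 0.1855540, 0.1881852),
    (0.2973769, 0.6273491, 0.0752741),
    (0.0270343, 0.0706872, 0.9911085),
]
XYZ_TO_SRGB = [
    (3.2404542, -1.5371385, -0.4985314),
    (-0.9692660, 1.8760108, 0.0415560),
    (0.0556434, -0.2040259, 1.0572252),
]

def mat_mul(m, v):
    return [sum(row[i] * v[i] for i in range(3)) for row in m]

def srgb_encode(c):
    c = min(max(c, 0.0), 1.0)  # out-of-gamut values simply clip here
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def adobe_to_srgb(rgb):
    linear = [c ** 2.2 for c in rgb]  # Adobe RGB uses a ~2.2 tone curve
    xyz = mat_mul(ADOBE_TO_XYZ, linear)
    return [srgb_encode(c) for c in mat_mul(XYZ_TO_SRGB, xyz)]

print(adobe_to_srgb([1.0, 1.0, 1.0]))  # white stays white: both are D65
```

A fully saturated Adobe RGB green lands outside the sRGB gamut, so its red channel clips to zero here; a production conversion would pick a proper gamut-mapping strategy instead of a plain clamp.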
  

Carbon
Senior Member
****
Offline


Tinfoilus Illuminatus

Posts: 285
Location: Living Room
Joined: Jan 15th, 2013
Gender: Male
Re: On Gamma handling in Unreal.
Reply #3 - Apr 21st, 2016 at 10:50am
Though I don't quite get some of what you are doing, I do find this work quite compelling.  Cool
  
han
Re: On Gamma handling in Unreal.
Reply #4 - Aug 5th, 2016 at 11:09pm
For Unreal it currently seems to work best to assume the Apple RGB color space for most of the textures, while the textures in Nerf appear to all be in sRGB. So at least for Nerf, classifying the (solid) textures in terms of the right color space and generating mipmaps accordingly shouldn't be much work. For Unreal it will probably end up being largely per-texture work, for which I would need a decent higher-quality display to get optimal results.

I came up with a light attenuation function for linear rendering which resembles the overall behaviour, and especially the light levels, of the old weird distance attenuation code, without the dimming issue when objects are too close to light sources.

The basic form is:
Code:
(1 - smoothstep(a*(LightRadius+b), u*(LightRadius+v), Distance))^2


where smoothstep is as defined in the GLSL spec, and the parameters a, b, u, v lie in the ranges shown in the table below:

Parameter  Range
a          0-2
b          0-1
u          20-25
v          0-1

Additionally, these parameters need to satisfy the following relation, as otherwise smoothstep is undefined:
Code:
a*(LightRadius+b) < u*(LightRadius+v) 


The old attenuation function has its maximum at approximately 2*(LightRadius+1) and fell below a 1/512 = ~0.002 threshold (i.e. will definitely end up as zero in the results) at approximately 21*(LightRadius+1).
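For reference, the attenuation above written out as a Python sketch; the parameter defaults are just one choice inside the quoted ranges, picked to mirror the old function's 2*(LightRadius+1) plateau and ~21*(LightRadius+1) cutoff.

```python
# The attenuation above written out; smoothstep follows the GLSL spec.
# The parameter defaults are just one choice inside the quoted ranges.

def smoothstep(edge0, edge1, x):
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def attenuation(distance, light_radius, a=2.0, b=1.0, u=21.0, v=1.0):
    # Requires a*(light_radius+b) < u*(light_radius+v); otherwise the
    # smoothstep interval collapses and the result is undefined.
    return (1.0 - smoothstep(a * (light_radius + b),
                             u * (light_radius + v), distance)) ** 2

print(attenuation(1.0, 16.0))    # 1.0: full intensity inside the plateau
print(attenuation(500.0, 16.0))  # 0.0: clipped past the falloff range
```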

I plotted some comparisons of the effect at a dead-on angle on a white texture, while approximating the effect of the "old" ~sRGB rendering with a gamma of 2.2:
http://coding.hanfling.de/UnrealSmoothStepAttenuation.pdf

One motivation for preferring the smoothstep approach over a "physically more correct" 1/r^2 attenuation for point light sources is to think of LightRadius as also being a parameter controlling the physical extent of the light source itself. If you are "close" to an extended light source, the source appears flat and infinitely extended, and thus the light intensity stays approximately constant. This is basically the same geometric consideration you might have heard in school for a horseshoe magnet or a parallel-plate capacitor.

In other news, I made some progress on the toolchain front too. I wrote a UExporter for exporting bitmap fonts with correctly rebuilt borders around the glyphs, and a UExporter and a UFactory for exporting UPalettes to ACT (Adobe Color Table) files. Sadly ACT is just 24 bit, but this should be sufficient nonetheless.

The idea behind the UPalette UFactory/UExporter is that, on the one hand, I cannot use P8 textures for storing sufficient-quality mipmaps, but they are still needed for the palette-based light effects. It is also a step towards exporting/reimporting fractal textures, where for most of them the palette is crucial. My plan for exporting/reimporting fractal textures is basically to use some sort of ini file to store the properties of the fractal texture and its sparks.

I conclude for now with some screenshots showing the effect of scaling the light based on the Brightness setting, rather than messing with the gamma ramp to control the brightness. The screenshots were done using linear rendering, but the results are similar to those one gets with the old non-linear rendering.









  

han
Re: On Gamma handling in Unreal.
Reply #5 - Oct 12th, 2016 at 9:16pm
I finally got around to implementing parts of the texture resampling code for my build commandlet.

Currently I can take sRGB / linear (sRGB) / Apple RGB / Adobe RGB data, optionally convert it to linear (sRGB) data for mipmap filtering, and output it either as linear (sRGB) or as gamma-encoded RGB (i.e. what hardware natively supports), in RGBA layout.

Mipmap generation currently supports only a simple box filter, but it is done in floating point.

It certainly helps not to have the palettized mipmaps anymore, and the linear filtering of the mipmaps noticeably reduces the odd darkening-in-the-distance effect so prominent in Deus Ex.
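A toy version of one such box-filter mip step, reduced to a single row of texels and a plain 2.2 decode instead of the per-color-space handling described above:

```python
# Toy version of one box-filter mip step, reduced to a single row and a
# plain 2.2 decode instead of the per-color-space handling above.

def mip_box_linear(row, gamma=2.2):
    """Average adjacent texel pairs in linear light, re-encode afterwards."""
    out = []
    for i in range(0, len(row), 2):
        a = (row[i] / 255.0) ** gamma
        b = (row[i + 1] / 255.0) ** gamma
        out.append(round(((a + b) / 2.0) ** (1.0 / gamma) * 255))
    return out

# Black and white average to a perceptually sensible bright grey, not the
# naive byte average of 128 that causes the darkening in the distance:
print(mip_box_linear([0, 255]))  # [186]
```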

The nice thing about this initial (small) implementation is that I can easily expand it. Among the most interesting extensions are certainly mipmap generation in CIELAB/CIELUV space, an option for biasing alpha-channel mipmap generation for masked textures, support for non-box filters, etc. Smiley

Another thing I should probably soon start to build in is support for texture compression. There is a header for S3TC compression in the PubSrc's, and the lib for it is found inside the Deus Ex SDK, which would be a quick start. Probably not the best solution, but it would be the same compressor as used before when importing 24-bit textures, and I would rather quickly make up for the currently missing 24-bit -> S3TC import step when using the build commandlet (though one could still use the old-style #exec lines to get it).
In any case this would be a good start towards support for RGTC and BPTC, which are certainly high on my priority list, as the old S3TC compressed formats are more than troublesome for storing normals, etc.

However, what is somewhat odd is that the built-in BMP/PCX factories/exporters treat TEXF_RGBA8 as BGRA data, and I'm not sure whether I will write replacements or wrappers around those which handle the conversion to RGBA data, or really completely treat TEXF_RGBA8 as BGRA data. I tend towards making it 'real' RGBA data, as this would in the end be less confusing, and I wouldn't need to pull this BGRA ordering through all the custom code I still want to write to support additional formats.
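The conversion itself is only a channel swizzle; a trivial sketch of the wrapper route (the helper name is mine):

```python
# Trivial sketch of the wrapper route: the only work is a B/R swizzle
# per 4-byte texel (helper name is mine).

def bgra_to_rgba(data):
    """Swap blue and red in packed 4-byte texels."""
    out = bytearray(data)
    for i in range(0, len(out), 4):
        out[i], out[i + 2] = out[i + 2], out[i]
    return bytes(out)

print(bgra_to_rgba(bytes([1, 2, 3, 4])))  # b'\x03\x02\x01\x04'
```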

Also, I read an article (about PBR) stating that artists tend to make diffuse textures too dark, which causes issues for shading. The textures in Deus Ex are really dark, in Unreal rather dark, while in Nerf they are at a rather sane brightness level; and linear rendering turned out to work best for Nerf, to some degree for Unreal, and not at all for Deus Ex. So I gave brightening some of the Deus Ex textures a shot, and the rendering certainly did improve noticeably. Looks like I found another reason why things in Deus Ex didn't work out so far. Smiley
  

han
Re: On Gamma handling in Unreal.
Reply #6 - Oct 13th, 2016 at 4:20pm
Alright, I implemented support for generating mipmaps in CIELAB and CIELUV space.

Though I probably posted this before: the motivation is based on this paper by Burger, which points out that performing linear filtering in either sRGB or linearized sRGB is actually not a good choice, and which suggests using CIELAB or CIELUV instead.

Comparison:
http://coding.hanfling.de/MipsFilter.bmp

Legend:
a) Mips generated in sRGB space, filtered in sRGB space ingame
b) Mips generated in linear (sRGB) space, filtered in linear (sRGB) space ingame.
c) Mips generated in CIELAB, filtered in linear (sRGB) space ingame.
d) Mips generated in CIELUV, filtered in linear (sRGB) space ingame.

I haven't done any extensive testing yet, but I wanted to give a road with a white stripe a try, as this usually ends up in some bloating of the white stripe after anisotropic filtering gives up (I guess my driver settings enforced 8x anisotropic filtering for this shot).

The differences are certainly minor, but imho the white bloating is slightly reduced when generating the mipmaps in CIELAB or CIELUV space. So I'm curious to try out whether storing the texture data in CIELAB, letting the GPU filter it linearly in CIELAB space, and afterwards converting it to (linearized) sRGB inside the fragment shader will further reduce this effect *and* whether it will hopefully provide more pleasant texture magnification.

One further note on a), which doesn't seem far off from b): especially when the diffuse textures are rather dark, as in Deus Ex, a) causes a clearly noticeable and odd darkening-in-the-distance effect, while b) works fine.

However, CIELUV is a bitch. Take a look at the CIEXYZ <=> CIELUV transformations on Bruce Lindbloom's really helpful site:
http://www.brucelindbloom.com/index.html?Eqn_XYZ_to_Luv.html
http://www.brucelindbloom.com/index.html?Eqn_Luv_to_XYZ.html

For XYZ -> LUV: in case (X+15Y+3Z)==0 you get divide-by-zero errors for u' and v'. My current workaround is to set LUV=(0,0,0) in case (X+15Y+3Z)<SMALL_VALUE; however, only L=0 clearly makes sense in this case -- for U and V it's more of a "what else?" kind of choice. I think one approach for selecting U and V in this particular case would be whatever provides the most desirable results for a texture filtered towards a black hole.

For LUV -> XYZ: in case (u+14*L*u0)==0 and/or (v+14*L*v0)==0 you get a divide by zero for a and/or d respectively. My current (not bulletproof) solution is to check whether L<SMALL_NUMBER and -- as this is black -- set the XYZ values to (0,0,0). This seems to be sufficient for my two test textures so far. To make it bulletproof one could clamp (L,U,V) into [0,1]x[0,1]x[0,1] first and do the L<SMALL_NUMBER check afterwards. This is probably required for filters like Lanczos, where values get weighted with a negative sign, so L could become negative; I should actually add that straight away to be safe. However, I'll keep CIELUV for filtering unless it causes more trouble, recommend CIELAB over it, and for texture storage and shader code I won't consider it at all.
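A sketch of the XYZ -> Luv direction with that guard in place (Python; D65 reference white and the CIE constants; SMALL_VALUE spelled as a parameter):

```python
# Sketch of the XYZ -> L*u*v* direction with the guard described above:
# when X + 15Y + 3Z is ~0 the texel is black, so (0, 0, 0) is returned
# instead of dividing by zero. D65 reference white, CIE constants.

EPS = 216.0 / 24389.0     # CIE lightness threshold
KAPPA = 24389.0 / 27.0
XN, YN, ZN = 0.95047, 1.0, 1.08883
DN = XN + 15.0 * YN + 3.0 * ZN

def xyz_to_luv(x, y, z, small=1e-9):
    d = x + 15.0 * y + 3.0 * z
    if d < small:  # black (or degenerate) input: guard the division
        return (0.0, 0.0, 0.0)
    up, vp = 4.0 * x / d, 9.0 * y / d
    upn, vpn = 4.0 * XN / DN, 9.0 * YN / DN
    yr = y / YN
    l = 116.0 * yr ** (1.0 / 3.0) - 16.0 if yr > EPS else KAPPA * yr
    return (l, 13.0 * l * (up - upn), 13.0 * l * (vp - vpn))

print(xyz_to_luv(0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0), no exception
print(xyz_to_luv(XN, YN, ZN))     # reference white: L*=100, u*=v*=0
```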

So much for now... oh, and hey: looks like my ideal of Nerf being in sRGB is starting to crack. To me, a great number of textures, especially the player skins, seem more right in Adobe RGB than in sRGB, and other people I asked also can't really tell which would be more right; so at least the difference for Nerf doesn't seem that large...
  

han
Re: On Gamma handling in Unreal.
Reply #7 - Mar 7th, 2017 at 1:26pm
The screenshots in this thread certainly showcase the advantage of linear rendering:
http://quest3d.com/forum/index.php?topic=64434.0
  

Kajgue
Global Moderator
Betatester
*****
Offline


Super-sexy-Kung-Fu-H
obo-sunva-bitch

Posts: 334
Location: Apophizal (T:S:B) Headquarters
Joined: Oct 17th, 2005
Gender: Male
Re: On Gamma handling in Unreal.
Reply #8 - Mar 7th, 2017 at 2:22pm
Thanks for this information Hanfling, this is useful. *thumbs up*

I'll go back over my other textures and re-export them with the colour spaces you've mentioned both in this thread and IRC.
  

AKA - ( T : S : B ) Ice-Lizard


Whistleblower Ted Gunderson
han
Re: On Gamma handling in Unreal.
Reply #9 - Jul 12th, 2017 at 6:37pm
Some shots of two particular textures in Deus Ex, assuming they are in sRGB/Apple RGB/ROMM:

http://coding.hanfling.de/Cargo/

The conversion to Apple RGB is done absolute-colorimetric; in the case of ROMM I used Photoshop with (afaik) the same setting. However, ROMM uses the D50 illuminant instead of the D65 the other two use, so there is plenty of room for experimenting with other white-point adaptation schemes.
  

han
Re: On Gamma handling in Unreal.
Reply #10 - Jul 12th, 2017 at 7:04pm
Oh, and: about the same scene as above in the nogamma shots, but this time rendered linearly, with linearly (box-)filtered mipmaps, and with the texture input for most textures in the scene assumed to be Adobe RGB. Roughly adjusted attenuation functions, lots of work left, etc. etc.


  

han
Re: On Gamma handling in Unreal.
Reply #11 - Oct 6th, 2017 at 3:00am
I've stepped aside for the moment from dealing with how to translate the old attenuation functions to linear rendering, and am instead trying out a more sensible one I found here (p. 12). I think the name I'll go with will be LE_Standard; currently I'm abusing LE_Unused for it.

Basically it's a quadratic falloff, enveloped so that it doesn't explode at the origin and fades to zero once it hits the LightRadius limit.
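The paper itself isn't quoted here, so this is my guess at the family of falloff meant: an inverse-square term kept finite at the origin and windowed smoothly to zero at the radius.

```python
# Guess at the described falloff family (not a transcription of the
# paper): an inverse-square term kept finite at the origin and windowed
# smoothly to zero at the LightRadius limit.

def windowed_inverse_square(distance, light_radius):
    """~1/d^2 falloff, enveloped at d=0 and faded out at the radius."""
    window = max(1.0 - (distance / light_radius) ** 4, 0.0) ** 2
    return window / (distance ** 2 + 1.0)

print(windowed_inverse_square(0.0, 100.0))    # 1.0: no explosion at d=0
print(windowed_inverse_square(100.0, 100.0))  # 0.0: zero at LightRadius
```

The +1 in the denominator is the enveloping constant; exposing it to the user is exactly the kind of tweak such functions allow.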

One certainly favorable property of it is that its brightness (how much light it gives off) is basically controlled by the brightness of the light source and not by the LightRadius property, unless you make that too small. So LightRadius ends up being just for culling -- trimming down the number of light sources affecting an area.

However, first of all I needed to be able to put really bright light sources in in the first place, so I settled on adding another LightExponent variable to Actor, which is used to calculate an additional scaling factor. As the default value I picked 127.

Calculation is just a simple:
Code:
Factor = 2^((LightExponent - (LightEffect==LE_Standard ? 0 : 127))/8)

And yes, LE_Standard starts out about 60k times as intense.
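The scaling factor written out as a quick sketch; it also checks the "about 60k" remark, since 2^(127/8) is roughly 60,000.

```python
# The scaling factor from above, written out; 2^(127/8) is ~60097, which
# matches the "about 60k" remark for LE_Standard lights.

def light_factor(light_exponent, le_standard):
    bias = 0 if le_standard else 127
    return 2.0 ** ((light_exponent - bias) / 8.0)

print(light_factor(127, le_standard=False))  # 1.0: old lights unchanged
print(light_factor(127, le_standard=True))   # ~60097
```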

Basically it works like this in the end (the screenshot used another Factor calculation, which yielded too-low values):

So...
a) The max of 255 in the screenshot just gave a 256x factor; now it's 64k.
b) Just clamping color values component-wise on saturation sucks.
c) Having no specular sucks, as otherwise this would bloom more towards white.
d) Unlit sucks.

So, first try in some box:

That's where I noticed that something was wrong, as the light was hardly visible at all with LightExponent=255 and LightBrightness=255, and I needed to adjust my Factor calculation for the LightExponent.

Another shot, this time with textures, in Nerf, where I deleted all other light sources beforehand:


Afterwards I went ahead and started some testing in the first Deus Ex map by replacing the large lamp posts with these kinds of light sources.

Single one only:


At the back of the statue (upper segments of the statue still using 'old' light sources):


I certainly do like the last shot, and I'm positively surprised by how plausible the lighting from this attenuation function is for just Lambertian diffuse. In the future I'll probably experiment a bit with exposing the constant in the denominator as user-configurable, as suggested in the pdf. I was also able to get more plausible lighting with just 1/4 of the original number of light sources, which is certainly a positive side effect, considering that at some point I do want to do realtime shadows for some light sources.

In any case, while doing this and trying to add a bit of ZoneAmbient, it really felt like running into the limitations of the Deus Ex textures; that is, their brightness varies unrealistically, and things like the bricks on the wall textures have way too much shadow baked in. :/
  

Kajgue
Re: On Gamma handling in Unreal.
Reply #12 - Oct 6th, 2017 at 6:10pm
The quadratic simulated light definitely looks very useful, especially as it simplifies what would currently be: place a bright, small-radius light actor around the light source, and then make a more macro light, emitted much more thinly from the light source, to cover the general area.

Good job, keep up the good work Han.
  
