Mirror of https://github.com/izzy2lost/UnrealEngineUWP.git, synced 2026-03-26 18:15:20 -07:00
#lockdown Nick.Penwarden
#rb none

========================== MAJOR FEATURES + CHANGES ==========================

Change 3055495 on 2016/07/19 by Marc.Olano
  Allow Noise material node on mobile. No reason to exclude mobile, except for Fast Gradient Noise, which uses 3D textures. Allow this node on ES2 for all of the other noise functions.
  #jira UE-33345

Change 3055602 on 2016/07/19 by Luke.Thatcher
  Fix crash bug in D3D11 RHI when selecting adapters.
  - Array of adapter descriptors will get out of sync with the adapter index if any adapter is skipped (e.g. the Microsoft Basic Render Device).
  #jira UE-33236

Change 3055890 on 2016/07/19 by Daniel.Wright
  Improved the assert in LoadModuleChecked so we won't have to check the log to see which module it was.

Change 3055891 on 2016/07/19 by Daniel.Wright
  Fixed Global Distance Field not dirtying the previous object position on UpdateTransform, which left behind a phantom shadow on teleports.
  * This will effectively double partial distance field update costs until clipping of the update regions is implemented.

Change 3055892 on 2016/07/19 by Daniel.Wright
  Higher poly light source shapes drawn into reflection captures.

Change 3055893 on 2016/07/19 by Daniel.Wright
  More info in the 'Incompatible surface format' GNM assert.

Change 3055904 on 2016/07/19 by Daniel.Wright
  Reflection environment normalization improvements:
  * Indirect specular from reflection captures is now mixed with indirect diffuse from lightmaps based on roughness, such that a mirror surface will have no mixing. Reflection captures now match other reflection methods like SSR and planar reflections much more closely.
  * When a stationary skylight is present, reflection captures are now normalized as if the initial skylight will always be present, giving consistent results with static skylight reflections. The skylight and the reflection captures with sky removed used to be normalized separately, compacting the relative brightness between the sky and scene.
  * Added r.ReflectionEnvironmentLightmapMixing for debugging lightmap mixing issues. This toggle was previously not possible due to prenormalizing the capture data.
  * The standard deferred reflection path (r.DoTiledReflections 0) can no longer match the results of the compute path or base pass reflections, as it would require MRT to accumulate the average brightness.
  * Removed unused r.DiffuseFromCaptures.
  * Cost of the reflection environment on PS4 increased from 1.52ms to 1.75ms with this change, but decreased back to 1.58ms by reducing tile size to 8x8.

Change 3055905 on 2016/07/19 by Daniel.Wright
  Workaround for RTDF shadows not working on PS4 - manual clear of ObjectIndirectArguments instead of RHICmdList.ClearUAV.

Change 3059486 on 2016/07/21 by Nick.Penwarden
  Testing
  #uecritical

Change 3060558 on 2016/07/21 by Daniel.Wright
  Fixed skylight with a specified cubemap being black.

Change 3061999 on 2016/07/22 by Marcus.Wassmer
  Disable old AMD driver hacks for DX11. QA has already tested with them off and given thumbs up.

Change 3062241 on 2016/07/22 by Daniel.Wright
  Fixed bug in RHISupportsSeparateMSAAAndResolveTextures that was preventing MSAA on all non-Vulkan platforms.

Change 3062244 on 2016/07/22 by Daniel.Wright
  Discard old prenormalized reflection environment data on load.

Change 3062283 on 2016/07/22 by Daniel.Wright
  MSAA support for the forward renderer:
  * The AntiAliasing method is chosen in the Rendering project settings, DefaultSettings category.
  * Deferred passes like shadow projection, fogging and decals are only computed per-pixel and can introduce aliasing.
  * Added Rendering project setting VertexFoggingForOpaque, which makes height fog cheaper and work properly with MSAA.
  * The AntiAliasing method in PostProcessSettings has been removed; this may affect existing content.
  * Added r.MSAACount, which defaults to 4.
  * Integrated the wide custom resolve filter from the Oculus renderer, controlled by r.WideCustomResolve.
  * GBuffer targets are no longer allocated when using the forward renderer.
  * Decal blend modes that write to the GBuffer fall back to SceneColor emissive only.

Change 3062666 on 2016/07/23 by Uriel.Doyon
  Added legend to streaming accuracy viewmodes. Added a new helper class FRenderTargetTemp to be reused in different canvas rendering. Exposed the pass-through pixel shader so that it can be reused.
  #review-3058986 @marcus.wassmer

Change 3063023 on 2016/07/25 by Luke.Thatcher
  Fix "RecompileShaders Changed" when using Cook On The Fly.
  #jira UE-33573

Change 3063078 on 2016/07/25 by Ben.Woodhouse
  Add -emitdrawevents command line option to emit draw events by default. This is useful when capturing with RenderDoc.

Change 3063315 on 2016/07/25 by Ben.Woodhouse
  Fix div 0 in motion blur. This caused artifacts in some fairly common cases.
  #jira UE-32331

Change 3063897 on 2016/07/25 by Uriel.Doyon
  Fixed missing qualifier on interpolants.

Change 3064559 on 2016/07/26 by Ben.Woodhouse
  Fix for cooker crash with BC6H textures (XB1, but may affect other platforms). Also fixes a corruption issue with texture slices not being a multiple of 4 pixels (expanding as necessary), courtesy of Stu McKenna at the Coalition. Tested fix on Xbox, PC and PS4, using QAGame.
  #jira UE-28592

Change 3064896 on 2016/07/26 by Ben.Woodhouse
  Fix compile errors on PS4 (the variable "sample" was conflicting with a keyword). Also making encoding consistent on new shaders (ANSI rather than UTF-16).

Change 3064913 on 2016/07/26 by Ben.Marsh
  Fix spelling of the "Editor, Tools, Monolithics & DDC" node in Dev-Rendering build settings.

Change 3065326 on 2016/07/26 by Uriel.Doyon
  Fixed UnbuiltInstanceBoundsList not being reset correctly, creating broken rendered primitives.
  #jira UE-32585

Change 3065541 on 2016/07/26 by Daniel.Wright
  Materials with a GBuffer SceneTexture lookup will fail to compile with forward shading.

Change 3065543 on 2016/07/26 by Daniel.Wright
  Restored DetailMode changes causing a FGlobalComponentRecreateRenderStateContext - accidental removal in CL 2969413.

Change 3065545 on 2016/07/26 by Daniel.Wright
  Added material property bNormalCurvatureToRoughness, which can slightly reduce aliasing. Tweakable impact with r.NormalCurvatureToRoughnessScale. Fixed reflection capture feedback with base pass reflections.

Change 3066783 on 2016/07/27 by Daniel.Wright
  Moved PreShadowCacheDepthZ out of FSceneRenderTargets and into FScene, which fixes issues with cached preshadows and multiple scenes, including HighResScreenShot. Disabled GMinScreenRadiusForShadowCaster on per-object shadows, which fixes popping when trying to increase shadow resolution from the defaults (r.Shadow.TexelsPerPixel 3).

Change 3066794 on 2016/07/27 by Daniel.Wright
  Fixed crash rendering planar reflections due to NULL PostProcessSettings.

Change 3067412 on 2016/07/27 by Daniel.Wright
  Fix for OpenGL4 with uint interpolator.

Change 3068470 on 2016/07/28 by Daniel.Wright
  Fixed crash rendering translucency with translucent shadows which were determined to be invisible.

Change 3069046 on 2016/07/28 by Daniel.Wright
  Handle null Family in SetupAntiAliasingMethod.

Change 3069059 on 2016/07/28 by Daniel.Wright
  Added r.ReflectionEnvironmentBeginMixingRoughness (.1) and r.ReflectionEnvironmentEndMixingRoughness (.3), which can be used to tweak the lightmap mixing heuristic, or revert to previous behavior (mixing even on a mirror surface).

Change 3069391 on 2016/07/28 by Daniel.Wright
  Fixed AverageBrightness being applied to reflections in gamma space in the mobile base pass, causing ES2 reflections to be overbright.

Change 3070369 on 2016/07/29 by Daniel.Wright
  r.ReflectionEnvironmentBeginMixingRoughness and r.ReflectionEnvironmentEndMixingRoughness set to 0 can be used to achieve the old non-roughness based lightmap mixing.

Change 3070370 on 2016/07/29 by Daniel.Wright
  Bumped the reflection capture DDC version to get rid of legacy prenormalized data.

Change 3070680 on 2016/07/29 by Marcus.Wassmer
  Fix Slate ensure that is most likely a timing issue exposed by rendering.
  #ue-33902

Change 3070811 on 2016/07/29 by Marcus.Wassmer
  Fix ProjectLauncher errors when loading old versions.
  #ue-33939

Change 3070971 on 2016/07/29 by Uriel.Doyon
  Updated ListTextures outputs to fix cooked vs non-cooked differences and also to put emphasis on disk vs memory.

Change 3071452 on 2016/07/31 by Uriel.Doyon
  Updated the legend description for the (texture streaming) primitive distance accuracy view mode.

[CL 3072803 by Marcus Wassmer in Main branch]
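The roughness-based lightmap mixing from change 3055904, with the window cvars from change 3069059, can be pictured as a simple roughness ramp. The C sketch below is a hypothetical CPU-side illustration only: the engine's actual ComputeMixingWeight also factors in lightmap irradiance and capture average brightness, and its exact form is not shown in this file.

```c
#include <assert.h>
#include <math.h>

/* Hypothetical illustration of the roughness window set up by
   r.ReflectionEnvironmentBeginMixingRoughness (default .1) and
   r.ReflectionEnvironmentEndMixingRoughness (default .3):
   at or above End, lightmap mixing is fully enabled; at or below Begin the
   surface behaves like a mirror (no mixing); in between it ramps linearly.
   Setting Begin = End = 0 reproduces the old behavior (mixing even on a
   mirror surface), matching change 3070369. */
static float RoughnessMixingWeight(float Roughness, float BeginMixing, float EndMixing)
{
    if (Roughness >= EndMixing)
        return 1.0f;
    if (Roughness <= BeginMixing)
        return 0.0f;
    return (Roughness - BeginMixing) / (EndMixing - BeginMixing);
}
```

With the default window, a mirror (roughness below .1) gets weight 0, a rough surface gets weight 1, and a roughness of .2 sits halfway up the ramp.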
1198 lines, 48 KiB, Plaintext
// Copyright 1998-2016 Epic Games, Inc. All Rights Reserved.

/*=============================================================================
    BasePassPixelShader.usf: Base pass pixel shader
=============================================================================*/

#include "Common.usf"
#include "SHCommon.usf"
#include "Material.usf"
#include "BasePassCommon.usf"
#include "VertexFactory.usf"
#include "LightmapCommon.usf"
#include "ReflectionEnvironmentShared.usf"
#include "PlanarReflectionShared.usf"
#include "BRDF.usf"
#include "Random.usf"
#include "LightAccumulator.usf"
#include "DeferredShadingCommon.usf"
#include "VelocityCommon.usf"

#define PREV_FRAME_COLOR 1
#include "ScreenSpaceRayCast.usf"

float NormalCurvatureToRoughness(float3 WorldNormal)
{
    float3 dNdx = ddx(WorldNormal);
    float3 dNdy = ddy(WorldNormal);
    float x = dot(dNdx, dNdx);
    float y = dot(dNdy, dNdy);
    return max(x, y);
}

#if TRANSLUCENT_SELF_SHADOWING

#include "ShadowProjectionCommon.usf"

float4x4 WorldToShadowMatrix;
float4 ShadowUVMinMax;
float3 DirectionalLightDirection;
float4 DirectionalLightColor;

#endif

#if MATERIALBLENDING_ANY_TRANSLUCENT
// Downsampled translucency scale (eg 2.0 for r.SeparateTranslucencyScreenPercentage=50)
float DownsampleFactorFromSceneBufferSize;
#endif

#if MATERIAL_SHADINGMODEL_HAIR || SIMPLE_FORWARD_DIRECTIONAL_LIGHT
#include "ShadingModels.usf"
#endif

Texture2D HZBTexture;
SamplerState HZBSampler;
Texture2D PrevSceneColor;
SamplerState PrevSceneColorSampler;

#if PLATFORM_SUPPORTS_RENDERTARGET_WRITE_MASK && USE_DBUFFER && MATERIALDECALRESPONSEMASK && !MATERIALBLENDING_ANY_TRANSLUCENT
Texture2D<uint> DBufferMask;
#endif

#ifndef COMPILER_GLSL
#define COMPILER_GLSL 0
#endif

#define FORCE_FULLY_ROUGH (SIMPLE_FORWARD_SHADING || MATERIAL_FULLY_ROUGH)
#define EDITOR_ALPHA2COVERAGE (USE_EDITOR_COMPOSITING && FEATURE_LEVEL >= FEATURE_LEVEL_SM5 && !COMPILER_GLSL)
#define LIGHT_GRID_FORWARD_SHADING ((FORWARD_SHADING || TRANSLUCENCY_LIGHTING_SURFACE_PERPIXEL) && FEATURE_LEVEL >= FEATURE_LEVEL_SM5)

#if LIGHT_GRID_FORWARD_SHADING
#include "ForwardLightingCommon.usf"
#endif

#if TRANSLUCENCY_LIGHTING_SURFACE || TRANSLUCENCY_LIGHTING_SURFACE_PERPIXEL || FORWARD_SHADING

#if FEATURE_LEVEL >= FEATURE_LEVEL_SM5
/** Prenormalized capture of the scene that's closest to the object being rendered, used for reflection environment on translucency. */
TextureCubeArray ReflectionCubemap;
SamplerState ReflectionCubemapSampler;
int CubemapArrayIndex;
#else
TextureCube ReflectionCubemap;
SamplerState ReflectionCubemapSampler;
#endif

float4 ReflectionPositionAndRadius;
float ReflectionShape;
float4x4 BoxTransform;
float4 BoxScales;
float4 CaptureOffsetAndAverageBrightness;

uint MortonCode( uint x )
{
    //x = (x ^ (x << 8)) & 0x00ff00ff;
    //x = (x ^ (x << 4)) & 0x0f0f0f0f;
    x = (x ^ (x << 2)) & 0x33333333;
    x = (x ^ (x << 1)) & 0x55555555;
    return x;
}
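The MortonCode helper above spreads the low two bits of a coordinate so that x and y bits can be interleaved; the SSR block later in this file combines two of these into a 4x4 per-pixel dither index. A direct C port, for illustration:

```c
#include <assert.h>
#include <stdint.h>

/* C port of the shader's MortonCode: moves bit i of the low 2 bits to
   bit 2*i, leaving the odd bit positions free for the other coordinate. */
static uint32_t MortonCode(uint32_t x)
{
    x = (x ^ (x << 2)) & 0x33333333u;
    x = (x ^ (x << 1)) & 0x55555555u;
    return x;
}

/* Interleaved 4x4 pixel index, built exactly the way the SSR dither code
   below combines the two coordinates. */
static uint32_t MortonIndex4x4(uint32_t px, uint32_t py)
{
    return MortonCode(px & 3u) | (MortonCode(py & 3u) * 2u);
}
```

For the 2-bit inputs used here, MortonCode maps 0, 1, 2, 3 to 0, 1, 4, 5, so the combined index covers all 16 values of a 4x4 tile exactly once.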
half3 GetImageBasedReflectionLighting(FMaterialPixelParameters MaterialParameters, half Roughness, half3 SpecularColor, half IndirectIrradiance)
{
    float3 N = MaterialParameters.WorldNormal;
    float3 V = MaterialParameters.CameraVector;

    float3 RayDirection = 2 * dot( V, N ) * N - V;
    half NoV = saturate(dot(N, V));

    float4 SpecularIBL = float4(0, 0, 0, 0);
    float3 CaptureVector = MaterialParameters.AbsoluteWorldPosition - ReflectionPositionAndRadius.xyz;
    float CaptureVectorLength = length(CaptureVector);

    float2 CompositedAverageBrightness = float2(0.0f, 1.0f);

    BRANCH
    if (CaptureVectorLength < ReflectionPositionAndRadius.w)
    {
        float NormalizedDistanceToCapture = saturate(CaptureVectorLength / ReflectionPositionAndRadius.w);

        float DistanceAlpha;
        float3 ProjectedCaptureVector;
        if (ReflectionShape > 0.0f)
        {
            ProjectedCaptureVector = GetLookupVectorForBoxCapture(RayDirection, MaterialParameters.AbsoluteWorldPosition, ReflectionPositionAndRadius, BoxTransform, BoxScales, CaptureOffsetAndAverageBrightness.xyz, DistanceAlpha);
        }
        else
        {
            ProjectedCaptureVector = GetLookupVectorForSphereCapture(RayDirection, MaterialParameters.AbsoluteWorldPosition, ReflectionPositionAndRadius, NormalizedDistanceToCapture, CaptureOffsetAndAverageBrightness.xyz, DistanceAlpha);
        }

        half AbsoluteSpecularMip = ComputeReflectionCaptureMipFromRoughness(Roughness, View.ReflectionCubemapMaxMip);

#if FEATURE_LEVEL >= FEATURE_LEVEL_SM5
        float4 Sample = TextureCubeArraySampleLevel(ReflectionCubemap, ReflectionCubemapSampler, ProjectedCaptureVector, CubemapArrayIndex, AbsoluteSpecularMip);
#else
        float4 Sample = TextureCubeSampleLevel(ReflectionCubemap, ReflectionCubemapSampler, ProjectedCaptureVector, AbsoluteSpecularMip);
#endif
        Sample *= DistanceAlpha;
        SpecularIBL = Sample;

        float AverageBrightness = CaptureOffsetAndAverageBrightness.w;
        CompositedAverageBrightness.x += AverageBrightness * DistanceAlpha * CompositedAverageBrightness.y;
        CompositedAverageBrightness.y *= 1 - DistanceAlpha;
    }

    float3 ExtraIndirectSpecular = 0;

#if ENABLE_SKY_LIGHT
    BRANCH
    if (SkyLightParameters.y > 0 && SpecularIBL.a < .999f)
    {
        float SkyAverageBrightness = 1.0f;
        float3 SkyLighting = GetSkyLightReflection(RayDirection, Roughness, SkyAverageBrightness);

        // Normalize for static skylight types which mix with lightmaps
        bool bNormalize = SkyLightParameters.z < 1 && ALLOW_STATIC_LIGHTING;

        FLATTEN
        if (bNormalize)
        {
            // Add in sky wherever reflection captures don't have coverage
            SpecularIBL.rgb += (1 - SpecularIBL.a) * SkyLighting;
            CompositedAverageBrightness.x += SkyAverageBrightness * CompositedAverageBrightness.y;
        }
        else
        {
            ExtraIndirectSpecular += SkyLighting;
        }
    }
#endif

#if ALLOW_STATIC_LIGHTING
    // Note: make sure this matches the lightmap mixing done on opaque (ReflectionEnvironmentTiledDeferredMain)
    SpecularIBL.rgb *= ComputeMixingWeight(IndirectIrradiance, CompositedAverageBrightness.x, Roughness);
#endif

    SpecularIBL.rgb += (1 - SpecularIBL.a) * ExtraIndirectSpecular;

    // Factors derived from EnvBRDFApprox( SpecularColor, 1, 1 ) == SpecularColor * 0.4524 - 0.0024
    float3 SpecularBounce = 0.45f * SpecularColor * IndirectIrradiance;
    // Replace reflection captures with indirect diffuse when we're rendering to a reflection capture, to avoid a feedback loop
    SpecularIBL.rgb = lerp(SpecularIBL.rgb, SpecularBounce, View.RenderingReflectionCaptureMask);

#if MATERIAL_SSR && !FORWARD_SHADING
    if( View.CameraCut == 0 )
    {
        //uint ViewRandom = (uint)(View.TemporalAAParams.r * 1551);
        uint ViewRandom = View.StateFrameIndexMod8 * 1551;

        uint Morton = MortonCode( (uint)MaterialParameters.SvPosition.x & 3 ) | ( MortonCode( (uint)MaterialParameters.SvPosition.y & 3 ) * 2 );
        uint PixelIndex = ReverseBits32( Morton ) >> 28;
        //uint PixelIndex = ( (uint)MaterialParameters.SvPosition.x & 3 ) | ( ( (uint)MaterialParameters.SvPosition.y & 3 ) * 2 );
        //PixelIndex = ( PixelIndex * 1551 ) & 15;

        uint Offset = ( PixelIndex + ViewRandom ) & 15;
        float StepOffset = Offset / 15.0;
        StepOffset -= 0.5;

        float4 HitUVzTime;
        float HCBLevel;

        RayCast(
            HZBTexture, HZBSampler, float2(1, 1),
            MaterialParameters.WorldPosition_CamRelative, RayDirection, 0, 0, MaterialParameters.ScreenPosition.w,
            12, StepOffset,
            HitUVzTime, HCBLevel
        );

        // if there was a hit
        BRANCH if( HitUVzTime.w < 1 )
        {
            float4 SSR = SampleScreenColor( PrevSceneColor, PrevSceneColorSampler, HitUVzTime.xyz );
            SSR *= saturate( 2 - 6.6 * Roughness );
            SpecularIBL.rgb = SpecularIBL.rgb * (1 - SSR.a) + SSR.rgb;
        }
    }
#endif

    SpecularColor = EnvBRDFApprox(SpecularColor, Roughness, NoV);

    float3 SpecularLighting = SpecularIBL.rgb;

    // Have to opt-in to receiving planar reflections with forward shading
#if !FORWARD_SHADING || MATERIAL_PLANAR_FORWARD_REFLECTIONS
    // Plane normal will be zero if the feature is disabled
    BRANCH
    if (abs(dot(ReflectionPlane.xyz, 1)) > .0001f)
    {
        // Reuse ReflectionCubemapSampler to avoid reducing the sampler count available to artists
        float4 PlanarReflection = ComputePlanarReflections(MaterialParameters.AbsoluteWorldPosition, MaterialParameters.WorldNormal, Roughness, ReflectionCubemapSampler);
        // Planar reflections win over SSR and reflection environment
        SpecularLighting = PlanarReflection.rgb + (1 - PlanarReflection.a) * SpecularLighting;
    }
#endif

    return SpecularLighting * SpecularColor;
}
#endif
void GetVolumeLightingNonDirectional(float4 AmbientLightingVector, float3 DiffuseColor, inout float3 InterpolatedLighting, out float4 VolumeLighting)
{
    // Normal is not taken into account with non-directional lighting, and only the ambient term of the SH coefficients is needed
    FOneBandSHVectorRGB TranslucentLighting;
    TranslucentLighting.R.V.x = AmbientLightingVector.r;
    TranslucentLighting.G.V.x = AmbientLightingVector.g;
    TranslucentLighting.B.V.x = AmbientLightingVector.b;

    FOneBandSHVector DiffuseTransferSH = CalcDiffuseTransferSH1(1);
    VolumeLighting = float4(DotSH1(TranslucentLighting, DiffuseTransferSH), AmbientLightingVector.a);
    InterpolatedLighting = DiffuseColor * VolumeLighting.rgb;
}

void GetVolumeLightingDirectional(float4 AmbientLightingVector, float3 DirectionalLightingVector, float3 WorldNormal, float3 DiffuseColor, inout float3 InterpolatedLighting, out float4 VolumeLighting)
{
    float DirectionalLightingIntensity = GetMaterialTranslucencyDirectionalLightingIntensity();

    AmbientLightingVector.rgb /= DirectionalLightingIntensity;
    DirectionalLightingVector.rgb *= DirectionalLightingIntensity;

    // Reconstruct the SH coefficients based on what was encoded
    FTwoBandSHVectorRGB TranslucentLighting;
    TranslucentLighting.R.V.x = AmbientLightingVector.r;
    TranslucentLighting.G.V.x = AmbientLightingVector.g;
    TranslucentLighting.B.V.x = AmbientLightingVector.b;
    float3 NormalizedAmbientColor = AmbientLightingVector.rgb / Luminance( AmbientLightingVector.rgb );

    // Scale the monochrome directional coefficients with the normalized ambient color as an approximation to the uncompressed values
    TranslucentLighting.R.V.yzw = DirectionalLightingVector.rgb * NormalizedAmbientColor.r;
    TranslucentLighting.G.V.yzw = DirectionalLightingVector.rgb * NormalizedAmbientColor.g;
    TranslucentLighting.B.V.yzw = DirectionalLightingVector.rgb * NormalizedAmbientColor.b;

    // Compute diffuse lighting which takes the normal into account
    FTwoBandSHVector DiffuseTransferSH = CalcDiffuseTransferSH(WorldNormal, 1);
    VolumeLighting = float4(max(half3(0,0,0), DotSH(TranslucentLighting, DiffuseTransferSH)), AmbientLightingVector.a);
    InterpolatedLighting += DiffuseColor * VolumeLighting.rgb;
}
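GetVolumeLightingDirectional above reconstructs per-channel band-1 SH coefficients by tinting a single monochrome directional vector with the luminance-normalized ambient color. A C sketch of that reconstruction; the 0.3/0.59/0.11 luminance weights match the Luminance() UE4 defines elsewhere, but since Common.usf is not part of this file, treat them as an assumption:

```c
#include <assert.h>
#include <math.h>

/* Assumed luminance weights (dot(Color, (0.3, 0.59, 0.11))). */
static float Luminance(const float rgb[3])
{
    return 0.3f * rgb[0] + 0.59f * rgb[1] + 0.11f * rgb[2];
}

/* Approximate the uncompressed per-channel band-1 coefficients: each color
   channel's directional vector is the shared monochrome directional vector
   scaled by that channel's luminance-normalized ambient color. */
static void ReconstructBand1(const float Ambient[3], const float Directional[3],
                             float OutBand1[3][3]) /* [channel][coefficient] */
{
    float Lum = Luminance(Ambient);
    for (int c = 0; c < 3; ++c)
    {
        float Tint = Ambient[c] / Lum;
        for (int i = 0; i < 3; ++i)
            OutBand1[c][i] = Directional[i] * Tint;
    }
}
```

For a gray ambient term the tint is 1 for every channel, so the reconstruction degenerates to the shared directional vector; chroma only appears when the ambient color is tinted.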
/** Calculates lighting for translucency. */
float3 GetTranslucencyLighting(
    FMaterialPixelParameters MaterialParameters,
    FPixelMaterialInputs PixelMaterialInputs,
    FBasePassInterpolantsVSToPS BasePassInterpolants,
    FGBufferData GBuffer,
    float IndirectIrradiance)
{
    float4 VolumeLighting;
    float3 InterpolatedLighting = 0;

    float3 InnerVolumeUVs;
    float3 OuterVolumeUVs;
    float FinalLerpFactor;
    ComputeVolumeUVs(MaterialParameters.AbsoluteWorldPosition, MaterialParameters.LightingPositionOffset, InnerVolumeUVs, OuterVolumeUVs, FinalLerpFactor);

#if TRANSLUCENCY_LIGHTING_VOLUMETRIC_PERVERTEX_DIRECTIONAL

    GetVolumeLightingDirectional(float4(BasePassInterpolants.AmbientLightingVector, 1), BasePassInterpolants.DirectionalLightingVector, MaterialParameters.WorldNormal, GBuffer.DiffuseColor, InterpolatedLighting, VolumeLighting);

#elif TRANSLUCENCY_LIGHTING_VOLUMETRIC_PERVERTEX_NONDIRECTIONAL

    GetVolumeLightingNonDirectional(float4(BasePassInterpolants.AmbientLightingVector, 1), GBuffer.DiffuseColor, InterpolatedLighting, VolumeLighting);

#elif TRANSLUCENCY_LIGHTING_VOLUMETRIC_DIRECTIONAL || TRANSLUCENCY_LIGHTING_SURFACE

    float4 AmbientLightingVector = GetAmbientLightingVectorFromTranslucentLightingVolume(InnerVolumeUVs, OuterVolumeUVs, FinalLerpFactor);
    float3 DirectionalLightingVector = GetDirectionalLightingVectorFromTranslucentLightingVolume(InnerVolumeUVs, OuterVolumeUVs, FinalLerpFactor);
    GetVolumeLightingDirectional(AmbientLightingVector, DirectionalLightingVector, MaterialParameters.WorldNormal, GBuffer.DiffuseColor, InterpolatedLighting, VolumeLighting);

#elif TRANSLUCENCY_LIGHTING_VOLUMETRIC_NONDIRECTIONAL

    float4 AmbientLightingVector = GetAmbientLightingVectorFromTranslucentLightingVolume(InnerVolumeUVs, OuterVolumeUVs, FinalLerpFactor);
    GetVolumeLightingNonDirectional(AmbientLightingVector, GBuffer.DiffuseColor, InterpolatedLighting, VolumeLighting);

#endif

#if (TRANSLUCENCY_LIGHTING_VOLUMETRIC_DIRECTIONAL || TRANSLUCENCY_LIGHTING_VOLUMETRIC_NONDIRECTIONAL || TRANSLUCENCY_LIGHTING_SURFACE) && TRANSLUCENT_SELF_SHADOWING

    // Only apply self shadowing if the shadow hasn't faded out completely
    if (DirectionalLightColor.a > 0)
    {
        // Determine the shadow space position
        // Apply a stable offset to the world position used for shadowing, which blurs out high frequency details in the shadowmap with many layers
        float4 HomogeneousShadowPosition = mul(float4(MaterialParameters.AbsoluteWorldPosition + MaterialParameters.LightingPositionOffset, 1), WorldToShadowMatrix);
        float2 ShadowUVs = HomogeneousShadowPosition.xy / HomogeneousShadowPosition.w;
        // Lookup the shadow density at the point being shaded
        float3 ShadowDensity = CalculateTranslucencyShadowingDensity(ShadowUVs, HomogeneousShadowPosition.z) / GetMaterialTranslucentMultipleScatteringExtinction();
        // Compute colored transmission based on the density that the light ray passed through
        float3 SelfShadowing = saturate(exp(-ShadowDensity * GetMaterialTranslucentSelfShadowDensityScale()));
        // Compute a second shadow gradient to add interesting information in the shadowed area of the first
        // This is a stop gap for not having self shadowing from other light sources
        float3 SelfShadowing2 = lerp(float3(1, 1, 1), saturate(exp(-ShadowDensity * GetMaterialTranslucentSelfShadowSecondDensityScale())), GetMaterialTranslucentSelfShadowSecondOpacity());
        SelfShadowing = SelfShadowing * SelfShadowing2;

        // Force unshadowed if we read outside the valid area of the shadowmap atlas
        // This can happen if the particle system's bounds don't match its visible area
        FLATTEN
        if (any(ShadowUVs < ShadowUVMinMax.xy || ShadowUVs > ShadowUVMinMax.zw))
        {
            SelfShadowing = 1;
        }

        float3 BackscatteredLighting = 0;

#if MATERIAL_SHADINGMODEL_SUBSURFACE

        float InScatterPower = GetMaterialTranslucentBackscatteringExponent();
        // Setup a pow lobe to approximate anisotropic in-scattering near to the light direction
        float InScattering = pow(saturate(dot(DirectionalLightDirection, MaterialParameters.CameraVector)), InScatterPower);

        float4 SSData = GetMaterialSubsurfaceData(MaterialParameters);
        float3 SubsurfaceColor = SSData.rgb;

        BackscatteredLighting =
            SubsurfaceColor
            * InScattering
            * DirectionalLightColor.rgb
            // Energy normalization, tighter lobes should be brighter
            * (InScatterPower + 2.0f) / 8.0f
            // Mask by shadowing, exaggerated
            * SelfShadowing * SelfShadowing
            * VolumeLighting.a;
#endif

        // The volume lighting already contains the contribution of the directional light,
        // so calculate the amount of light to remove from the volume lighting in order to apply per-pixel self shadowing
        // VolumeLighting.a stores all attenuation and opaque shadow factors
        float3 SelfShadowingCorrection = DirectionalLightColor.rgb * VolumeLighting.a * (1 - SelfShadowing);

        // Combine backscattering and directional light self shadowing
        InterpolatedLighting = (BackscatteredLighting + GBuffer.DiffuseColor * max(VolumeLighting.rgb - SelfShadowingCorrection, 0));
    }

#endif

#if TRANSLUCENCY_LIGHTING_SURFACE
    InterpolatedLighting += GetImageBasedReflectionLighting(MaterialParameters, GBuffer.Roughness, GBuffer.SpecularColor, IndirectIrradiance);
#endif

    return InterpolatedLighting;
}

#if SIMPLE_FORWARD_SHADING
#define GetEffectiveSkySHDiffuse GetSkySHDiffuseSimple
#else
#define GetEffectiveSkySHDiffuse GetSkySHDiffuse
#endif

/** Computes sky diffuse lighting, including precomputed shadowing. */
void GetSkyLighting(float3 WorldNormal, float2 LightmapUV, out float3 OutDiffuseLighting, out float3 OutSubsurfaceLighting)
{
    OutDiffuseLighting = 0;
    OutSubsurfaceLighting = 0;

#if ENABLE_SKY_LIGHT

    float SkyVisibility = 1;
    float GeometryTerm = 1;
    float3 SkyLightingNormal = WorldNormal;

#if HQ_TEXTURE_LIGHTMAP || CACHED_POINT_INDIRECT_LIGHTING || CACHED_VOLUME_INDIRECT_LIGHTING
    BRANCH
    if (View.SkyLightParameters.x > 0)
    {
#if HQ_TEXTURE_LIGHTMAP

        // Bent normal from precomputed texture
        float4 WorldSkyBentNormalAndOcclusion = GetSkyBentNormalAndOcclusion(LightmapUV * float2(1, 2));
        // Renormalize as vector was quantized and compressed
        float3 NormalizedBentNormal = normalize(WorldSkyBentNormalAndOcclusion.xyz);
        SkyVisibility = WorldSkyBentNormalAndOcclusion.w;

#elif CACHED_POINT_INDIRECT_LIGHTING || CACHED_VOLUME_INDIRECT_LIGHTING

        // Bent normal from the indirect lighting cache - one value for the whole object
        float3 NormalizedBentNormal = PrecomputedLightingBuffer.PointSkyBentNormal.xyz;
        SkyVisibility = PrecomputedLightingBuffer.PointSkyBentNormal.w;

#endif

#if (MATERIALBLENDING_TRANSLUCENT || MATERIALBLENDING_ADDITIVE) && (TRANSLUCENCY_LIGHTING_VOLUMETRIC_NONDIRECTIONAL || TRANSLUCENCY_LIGHTING_VOLUMETRIC_PERVERTEX_NONDIRECTIONAL)
        // NonDirectional lighting can't depend on the normal
        SkyLightingNormal = NormalizedBentNormal;
#else

        // Weight toward the material normal to increase directionality
        float BentNormalWeightFactor = 1 - (1 - SkyVisibility) * (1 - SkyVisibility);

        // We are lerping between the inputs of two lighting scenarios based on occlusion
        // In the mostly unoccluded case, evaluate sky lighting with the material normal, because it has higher detail
        // In the mostly occluded case, evaluate sky lighting with the bent normal, because it is a better representation of the incoming lighting
        // Then treat the lighting evaluated along the bent normal as an area light, so we must apply the lambert term
        SkyLightingNormal = lerp(NormalizedBentNormal, WorldNormal, BentNormalWeightFactor);

        float DotProductFactor = lerp(saturate(dot(NormalizedBentNormal, WorldNormal)), 1, BentNormalWeightFactor);
        // Account for darkening due to the geometry term
        GeometryTerm = DotProductFactor;
#endif
    }
#endif

    // Compute the preconvolved incoming lighting with the bent normal direction
    float3 DiffuseLookup = GetEffectiveSkySHDiffuse(SkyLightingNormal) * ResolvedView.SkyLightColor.rgb;

    // Apply AO to the sky diffuse
    OutDiffuseLighting += DiffuseLookup * (SkyVisibility * GeometryTerm);

#if MATERIAL_SHADINGMODEL_TWOSIDED_FOLIAGE
    float3 BackfaceDiffuseLookup = GetEffectiveSkySHDiffuse(-WorldNormal) * ResolvedView.SkyLightColor.rgb;
    OutSubsurfaceLighting += BackfaceDiffuseLookup * SkyVisibility;
#endif

#endif
}
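The bent-normal weighting in GetSkyLighting above can be sketched on the CPU. The 1D blend here stands in for the shader's float3 lerp of normals:

```c
#include <assert.h>
#include <math.h>

/* Weight factor from GetSkyLighting: 1 - (1 - SkyVisibility)^2 keeps the
   detailed material normal when the point is mostly unoccluded and shifts
   toward the bent normal as occlusion increases. */
static float BentNormalWeightFactor(float SkyVisibility)
{
    return 1.0f - (1.0f - SkyVisibility) * (1.0f - SkyVisibility);
}

static float Lerp(float A, float B, float T)
{
    return A + (B - A) * T;
}

/* 1D stand-in for the normal blend: weight 0 selects the bent-normal value,
   weight 1 selects the material-normal value. */
static float BlendTowardMaterialNormal(float BentValue, float MaterialValue, float SkyVisibility)
{
    return Lerp(BentValue, MaterialValue, BentNormalWeightFactor(SkyVisibility));
}
```

Because the weight is quadratic in occlusion, moderate occlusion barely moves the lookup direction: at 50% visibility the blend still favors the material normal with weight 0.75.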
/** Calculates indirect lighting contribution on this object from precomputed data. */
|
|
void GetPrecomputedIndirectLightingAndSkyLight(
|
|
FMaterialPixelParameters MaterialParameters,
|
|
FVertexFactoryInterpolantsVSToPS Interpolants,
|
|
FBasePassInterpolantsVSToPS BasePassInterpolants,
|
|
float3 DiffuseDir,
|
|
out float3 OutDiffuseLighting,
|
|
out float3 OutSubsurfaceLighting,
|
|
out float OutIndirectIrradiance)
|
|
{
|
|
OutIndirectIrradiance = 0;
|
|
OutDiffuseLighting = 0;
|
|
OutSubsurfaceLighting = 0;
|
|
float2 SkyOcclusionUV = 0;
|
|
|
|
// Method for movable components which want to use a volume texture of interpolated SH samples
|
|
#if CACHED_VOLUME_INDIRECT_LIGHTING
|
|
|
|
// Compute volume teture UVs from world position
|
|
float3 VolumeUVs = MaterialParameters.AbsoluteWorldPosition * PrecomputedLightingBuffer.IndirectLightingCachePrimitiveScale + PrecomputedLightingBuffer.IndirectLightingCachePrimitiveAdd;
|
|
// Clamp UV to be within the valid region
|
|
// Pixels outside of the object's bounding box would read garbage otherwise
|
|
VolumeUVs = clamp(VolumeUVs, PrecomputedLightingBuffer.IndirectLightingCacheMinUV, PrecomputedLightingBuffer.IndirectLightingCacheMaxUV);
|
|
float4 Vector0 = Texture3DSample(PrecomputedLightingBuffer.IndirectLightingCacheTexture0, PrecomputedLightingBuffer.IndirectLightingCacheTextureSampler0, VolumeUVs);
|
|
|
|
// For debugging
|
|
#define AMBIENTONLY 0
|
|
#if AMBIENTONLY
|
|
|
|
OutDiffuseLighting = Vector0.rgb / SHAmbientFunction() / PI;
|
|
|
|
#else
|
|
|
|
float4 Vector1 = Texture3DSample(PrecomputedLightingBuffer.IndirectLightingCacheTexture1, PrecomputedLightingBuffer.IndirectLightingCacheTextureSampler1, VolumeUVs);
|
|
float4 Vector2 = Texture3DSample(PrecomputedLightingBuffer.IndirectLightingCacheTexture2, PrecomputedLightingBuffer.IndirectLightingCacheTextureSampler2, VolumeUVs);
|
|
|
|
// Construct the SH environment
|
|
FTwoBandSHVectorRGB CachedSH;
|
|
CachedSH.R.V = float4(Vector0.x, Vector1.x, Vector2.x, Vector0.w);
|
|
CachedSH.G.V = float4(Vector0.y, Vector1.y, Vector2.y, Vector1.w);
|
|
CachedSH.B.V = float4(Vector0.z, Vector1.z, Vector2.z, Vector2.w);
|
|
|
|
// Diffuse convolution
|
|
FTwoBandSHVector DiffuseTransferSH = CalcDiffuseTransferSH(DiffuseDir, 1);
|
|
OutDiffuseLighting = max(half3(0,0,0), DotSH(CachedSH, DiffuseTransferSH)) / PI;
|
|
|
|
#if MATERIAL_SHADINGMODEL_TWOSIDED_FOLIAGE
|
|
FTwoBandSHVector SubsurfaceTransferSH = CalcDiffuseTransferSH(-DiffuseDir, 1);
|
|
OutSubsurfaceLighting += max(half3(0,0,0), DotSH(CachedSH, SubsurfaceTransferSH)) / PI;
|
|
#endif
|
|
|
|
#endif

// Method for movable components which want to use a single interpolated SH sample
#elif CACHED_POINT_INDIRECT_LIGHTING
	#if TRANSLUCENCY_LIGHTING_VOLUMETRIC_NONDIRECTIONAL

	FOneBandSHVectorRGB PointIndirectLighting;
	PointIndirectLighting.R.V = PrecomputedLightingBuffer.IndirectLightingSHCoefficients0[0].x;
	PointIndirectLighting.G.V = PrecomputedLightingBuffer.IndirectLightingSHCoefficients0[1].x;
	PointIndirectLighting.B.V = PrecomputedLightingBuffer.IndirectLightingSHCoefficients0[2].x;

	FOneBandSHVector DiffuseTransferSH = CalcDiffuseTransferSH1(1);
	OutDiffuseLighting = DotSH1(PointIndirectLighting, DiffuseTransferSH);

	#if MATERIAL_SHADINGMODEL_TWOSIDED_FOLIAGE
	FOneBandSHVector SubsurfaceTransferSH = CalcDiffuseTransferSH1(1);
	OutSubsurfaceLighting += DotSH1(PointIndirectLighting, SubsurfaceTransferSH);
	#endif

	#else

	FThreeBandSHVectorRGB PointIndirectLighting;
	PointIndirectLighting.R.V0 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients0[0];
	PointIndirectLighting.R.V1 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients1[0];
	PointIndirectLighting.R.V2 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients2[0];

	PointIndirectLighting.G.V0 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients0[1];
	PointIndirectLighting.G.V1 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients1[1];
	PointIndirectLighting.G.V2 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients2[1];

	PointIndirectLighting.B.V0 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients0[2];
	PointIndirectLighting.B.V1 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients1[2];
	PointIndirectLighting.B.V2 = PrecomputedLightingBuffer.IndirectLightingSHCoefficients2[2];

	FThreeBandSHVector DiffuseTransferSH = CalcDiffuseTransferSH3(DiffuseDir, 1);
	// Compute diffuse lighting which takes the normal into account
	OutDiffuseLighting = max(half3(0,0,0), DotSH3(PointIndirectLighting, DiffuseTransferSH));

	#if MATERIAL_SHADINGMODEL_TWOSIDED_FOLIAGE
	FThreeBandSHVector SubsurfaceTransferSH = CalcDiffuseTransferSH3(-DiffuseDir, 1);
	OutSubsurfaceLighting += max(half3(0,0,0), DotSH3(PointIndirectLighting, SubsurfaceTransferSH));
	#endif

	#endif

// High quality texture lightmaps
#elif HQ_TEXTURE_LIGHTMAP

	float2 LightmapUV0, LightmapUV1;
	GetLightMapCoordinates(Interpolants, LightmapUV0, LightmapUV1);
	SkyOcclusionUV = LightmapUV0;
	GetLightMapColorHQ(LightmapUV0, LightmapUV1, DiffuseDir, OutDiffuseLighting, OutSubsurfaceLighting);

// Low quality texture lightmaps
#elif LQ_TEXTURE_LIGHTMAP

	float2 LightmapUV0, LightmapUV1;
	GetLightMapCoordinates(Interpolants, LightmapUV0, LightmapUV1);
	OutDiffuseLighting = GetLightMapColorLQ(LightmapUV0, LightmapUV1, DiffuseDir).rgb;

#endif

	// Apply indirect lighting scale while we have only accumulated lightmaps
	OutDiffuseLighting *= View.IndirectLightingColorScale;

	float3 SkyDiffuseLighting;
	float3 SkySubsurfaceLighting;
	GetSkyLighting(DiffuseDir, SkyOcclusionUV, SkyDiffuseLighting, SkySubsurfaceLighting);

	OutSubsurfaceLighting += SkySubsurfaceLighting;

	// Sky lighting must contribute to IndirectIrradiance for ReflectionEnvironment lightmap mixing
	OutDiffuseLighting += SkyDiffuseLighting;

	#if HQ_TEXTURE_LIGHTMAP || LQ_TEXTURE_LIGHTMAP || CACHED_VOLUME_INDIRECT_LIGHTING || CACHED_POINT_INDIRECT_LIGHTING
	OutIndirectIrradiance = Luminance(OutDiffuseLighting);
	#endif
}

#if SIMPLE_FORWARD_DIRECTIONAL_LIGHT

float3 GetSimpleForwardLightingDirectionalLight(FGBufferData GBuffer, float3 DiffuseColor, float3 SpecularColor, float Roughness, float3 WorldNormal, float3 CameraVector)
{
	float3 V = -CameraVector;
	float3 N = WorldNormal;
	float3 L = ResolvedView.DirectionalLightDirection;
	float NoL = saturate( dot( N, L ) );

	float3 LightColor = ResolvedView.DirectionalLightColor.rgb * PI;

	// Not computing specular, material was forced fully rough
	float3 DiffuseLighting = StandardShading(DiffuseColor, SpecularColor, Roughness.xxx, 1.0f, L, V, N, float2(1, 0));
	float3 SubsurfaceLighting = SubsurfaceShading(GBuffer, L, V, N, 1.0f, uint2(0, 0));

	return LightColor * NoL * (DiffuseLighting + SubsurfaceLighting);
}

#endif

#if USE_EDITOR_COMPOSITING
bool bEnableEditorPrimitiveDepthTest;
int MSAASampleCount;

// depth in the red channel in DeviceZ
Texture2D FilteredSceneDepthTexture;
SamplerState FilteredSceneDepthTextureSampler;
#endif

// @return 0:translucent..1:opaque
float ClipForEditorPrimitives(FMaterialPixelParameters MaterialParameters)
{
	float Ret = 1;

	#if USE_EDITOR_COMPOSITING && (FEATURE_LEVEL >= FEATURE_LEVEL_SM4 || MOBILE_EMULATION)
	// Depth test manually if compositing editor primitives since the depth buffer is different (MSAA only)
	BRANCH
	if (bEnableEditorPrimitiveDepthTest)
	{
		#if HAS_INVERTED_Z_BUFFER
		//@todo-briank
		bool bIsPerspective = (ResolvedView.ViewToClip._m33 < 1.0f);
		#endif // HAS_INVERTED_Z_BUFFER

		// Dejitter the sample position and do a filtered lookup - for planes this allows reconstructing a much less jittery depth comparison function; it doesn't fix silhouettes, however
		float DeviceZ = Texture2DSampleLevel(FilteredSceneDepthTexture, FilteredSceneDepthTextureSampler, (MaterialParameters.SvPosition.xy - View.TemporalAAParams.zw) * View.BufferSizeAndInvSize.zw, 0).r;

		float PixelDeviceZ = MaterialParameters.SvPosition.z;

		// Soft bias with DeviceZ for best quality
		const float DeviceDepthFade = 0.00005f;

		// 0.5f is to bias around the actual value; biasing toward 1 or 0 is another option
		Ret = saturate(0.5f - (DeviceZ - PixelDeviceZ) / DeviceDepthFade);
	}
	#endif // USE_EDITOR_COMPOSITING && (FEATURE_LEVEL >= FEATURE_LEVEL_SM4 || MOBILE_EMULATION)

	// Note: multiple returns cause a strange HLSL compiler error for CV_Coverage in later code
	return Ret;
}

#if EDITOR_ALPHA2COVERAGE != 0
uint CustomAlpha2Coverage(inout float4 InOutColor)
{
	uint MaskedCoverage = 0xff;

	MaskedCoverage = 0;

	uint EnabledSampleCount = 1;

	// todo: support non-4xMSAA as well

	// conservatively on, but can be 0 if the opacity is too low
	if(InOutColor.a > 0.01f) { MaskedCoverage |= 0x1; }
	if(InOutColor.a > 0.25f) { MaskedCoverage |= 0x2; ++EnabledSampleCount; }
	if(InOutColor.a > 0.50f) { MaskedCoverage |= 0x4; ++EnabledSampleCount; }
	if(InOutColor.a > 0.75f) { MaskedCoverage |= 0x8; ++EnabledSampleCount; }

	// renormalize to make this sample the correct weight
	InOutColor *= (float)MSAASampleCount / EnabledSampleCount;
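	// Worked example (assuming 4xMSAA): InOutColor.a = 0.3 passes the 0.01 and 0.25
	// thresholds, so MaskedCoverage = 0x1|0x2 and EnabledSampleCount = 2, and the color
	// is scaled by 4/2 = 2. The hardware resolve then averages all 4 samples (2 covered,
	// 2 uncovered), so the primitive's net contribution is 2 * (2/4) = 1x the original
	// color, preserving overall brightness.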

	return MaskedCoverage;
}
#endif

void ApplyPixelDepthOffsetForBasePass(inout FMaterialPixelParameters MaterialParameters, FPixelMaterialInputs PixelMaterialInputs, inout FBasePassInterpolantsVSToPS BasePassInterpolants, out float OutDepth)
{
	float PixelDepthOffset = ApplyPixelDepthOffsetToMaterialParameters(MaterialParameters, PixelMaterialInputs, OutDepth);

	#if WRITES_VELOCITY_TO_GBUFFER
	BasePassInterpolants.VelocityPrevScreenPosition.w += PixelDepthOffset;

	#if WRITES_VELOCITY_TO_GBUFFER_USE_POS_INTERPOLATOR
	BasePassInterpolants.VelocityScreenPosition.w += PixelDepthOffset;
	#endif
	#endif
}

#if USES_GBUFFER

// The selective output mask can only depend on defines, since the shadow will not export the data.
uint GetSelectiveOutputMask()
{
	uint Mask = 0;
	#if !WRITES_CUSTOMDATA_TO_GBUFFER
	Mask |= SKIP_CUSTOMDATA_MASK;
	#endif
	#if !WRITES_PRECSHADOWFACTOR_TO_GBUFFER
	Mask |= SKIP_PRECSHADOW_MASK;
	#endif
	#if WRITES_PRECSHADOWFACTOR_ZERO
	Mask |= ZERO_PRECSHADOW_MASK;
	#endif
	#if !WRITES_VELOCITY_TO_GBUFFER
	Mask |= SKIP_VELOCITY_MASK;
	#endif
	return Mask;
}
#endif // USES_GBUFFER

// Called by MainPS() in PixelShaderOutputCommon.usf
void FPixelShaderInOut_MainPS(
	FVertexFactoryInterpolantsVSToPS Interpolants,
	FBasePassInterpolantsVSToPS BasePassInterpolants,
	in FPixelShaderIn In,
	inout FPixelShaderOut Out)
{
	#if INSTANCED_STEREO
	ResolvedView = ResolveView(GetEyeIndex(Interpolants.PackedEyeIndex));
	#else
	ResolvedView = ResolveView();
	#endif

	// Velocity
	float4 OutVelocity = 0;
	// CustomData
	float4 OutGBufferD = 0;
	// PreShadowFactor
	float4 OutGBufferE = 0;

	FMaterialPixelParameters MaterialParameters = GetMaterialPixelParameters(Interpolants, In.SvPosition);
	FPixelMaterialInputs PixelMaterialInputs;

	#if HQ_TEXTURE_LIGHTMAP && USES_AO_MATERIAL_MASK && !MATERIAL_SHADINGMODEL_UNLIT
	float2 LightmapUV0, LightmapUV1;
	GetLightMapCoordinates(Interpolants, LightmapUV0, LightmapUV1);
	// Must be computed before BaseColor, Normal, etc are evaluated
	MaterialParameters.AOMaterialMask = GetAOMaterialMask(LightmapUV0 * float2(1, 2));
	#endif

	float4 SVPositionForComputingPositions = In.SvPosition;

	#if MATERIALBLENDING_ANY_TRANSLUCENT
	// Modify SVPosition for passes that are rendering to a downsampled target so that ScreenPosition and WorldPosition will be computed correctly
	//SVPositionForComputingPositions.xy *= DownsampleFactorFromSceneBufferSize;
	#endif

	#if USE_WORLD_POSITION_EXCLUDING_SHADER_OFFSETS
	{
		float4 ScreenPosition = SvPositionToResolvedScreenPosition(SVPositionForComputingPositions);
		float3 TranslatedWorldPosition = SvPositionToResolvedTranslatedWorld(SVPositionForComputingPositions);
		CalcMaterialParametersEx(MaterialParameters, PixelMaterialInputs, In.SvPosition, ScreenPosition, In.bIsFrontFace, TranslatedWorldPosition, BasePassInterpolants.PixelPositionExcludingWPO);
	}
	#else
	{
		float4 ScreenPosition = SvPositionToScreenPosition(SVPositionForComputingPositions);
		float3 TranslatedWorldPosition = SvPositionToResolvedTranslatedWorld(SVPositionForComputingPositions);
		CalcMaterialParametersEx(MaterialParameters, PixelMaterialInputs, In.SvPosition, ScreenPosition, In.bIsFrontFace, TranslatedWorldPosition, TranslatedWorldPosition);
	}
	#endif

	#if USE_EDITOR_COMPOSITING && (FEATURE_LEVEL >= FEATURE_LEVEL_SM4 || MOBILE_EMULATION)
	const bool bEditorWeightedZBuffering = true;
	#else
	const bool bEditorWeightedZBuffering = false;
	#endif

	#if OUTPUT_PIXEL_DEPTH_OFFSET
	ApplyPixelDepthOffsetForBasePass(MaterialParameters, PixelMaterialInputs, BasePassInterpolants, Out.Depth);
	#endif

	// Clip if the blend mode requires it.
	if(!bEditorWeightedZBuffering)
	{
		GetMaterialCoverageAndClipping(MaterialParameters, PixelMaterialInputs);
	}
	// Store the results in local variables and reuse instead of calling the functions multiple times.
	half3 BaseColor = GetMaterialBaseColor(PixelMaterialInputs);
	half Metallic = GetMaterialMetallic(PixelMaterialInputs);
	half Specular = GetMaterialSpecular(PixelMaterialInputs);

	float MaterialAO = GetMaterialAmbientOcclusion(PixelMaterialInputs);
	float Roughness = GetMaterialRoughness(PixelMaterialInputs);

	#if MATERIAL_NORMAL_CURVATURE_TO_ROUGHNESS && FORWARD_SHADING
	float GeometricAARoughness = saturate(NormalCurvatureToRoughness(MaterialParameters.WorldNormal) * View.NormalCurvatureToRoughnessScaleBias.x + View.NormalCurvatureToRoughnessScaleBias.y);
	Roughness = max(Roughness, GeometricAARoughness);
	#endif

	// 0..1, SubsurfaceProfileId = int(x * 255)
	float SubsurfaceProfile = 0;

	// If we don't use this shading model the color should be black (don't generate shader code for unused data, don't do indirect lighting cache lighting with this color).
	float3 SubsurfaceColor = 0;
	#if MATERIAL_SHADINGMODEL_SUBSURFACE || MATERIAL_SHADINGMODEL_PREINTEGRATED_SKIN || MATERIAL_SHADINGMODEL_SUBSURFACE_PROFILE || MATERIAL_SHADINGMODEL_TWOSIDED_FOLIAGE || MATERIAL_SHADINGMODEL_CLOTH
	{
		float4 SubsurfaceData = GetMaterialSubsurfaceData(MaterialParameters);

		#if MATERIAL_SHADINGMODEL_SUBSURFACE || MATERIAL_SHADINGMODEL_PREINTEGRATED_SKIN || MATERIAL_SHADINGMODEL_TWOSIDED_FOLIAGE
		SubsurfaceColor = SubsurfaceData.rgb * View.DiffuseOverrideParameter.w + View.DiffuseOverrideParameter.xyz;
		#elif MATERIAL_SHADINGMODEL_CLOTH
		SubsurfaceColor = SubsurfaceData.rgb;
		#endif
		SubsurfaceProfile = SubsurfaceData.a;
	}
	#endif

	#if USE_DBUFFER && MATERIALDECALRESPONSEMASK && !MATERIALBLENDING_ANY_TRANSLUCENT
	// apply decals from the DBuffer
	#if SM5_PROFILE
	// Temporary workaround to avoid crashes on AMD; revert back to BRANCH
	FLATTEN
	#else
	BRANCH
	#endif
	if(Primitive.DecalReceiverMask > 0 && View.ShowDecalsMask > 0)
	{
		#if PLATFORM_SUPPORTS_RENDERTARGET_WRITE_MASK
		uint RTWriteMaskBit = DecodeRTWriteMaskTexture(In.SvPosition.xy, DBufferMask);
		if(RTWriteMaskBit)
		#endif
		{
			float2 NDC = MaterialParameters.ScreenPosition.xy / MaterialParameters.ScreenPosition.w;

			// Note: We are using View and not ResolvedView here.
			// It has the correct ScreenPositionScaleBias values for screen space compositing.
			float2 ScreenUV = NDC * View.ScreenPositionScaleBias.xy + View.ScreenPositionScaleBias.wz;

			FDBufferData DBufferData = GetDBufferData(ScreenUV);

			// the material can disable the DBuffer effects for better performance or control
			if((MATERIALDECALRESPONSEMASK & 0x1) == 0) { DBufferData.PreMulColor = 0; DBufferData.ColorOpacity = 1; }
			if((MATERIALDECALRESPONSEMASK & 0x2) == 0) { DBufferData.PreMulWorldNormal = 0; DBufferData.NormalOpacity = 1; }
			if((MATERIALDECALRESPONSEMASK & 0x4) == 0) { DBufferData.PreMulRoughness = 0; DBufferData.RoughnessOpacity = 1; }
			ApplyDBufferData(DBufferData, MaterialParameters.WorldNormal, SubsurfaceColor, Roughness, BaseColor, Metallic, Specular);
		}
	}
	#endif

	float3 LocalBaseColor = BaseColor;
	float LocalSpecular = Specular;

	#if MATERIAL_SHADINGMODEL_SUBSURFACE_PROFILE && USES_GBUFFER
	// SubsurfaceProfile applies the BaseColor in a later pass. Any lighting output in the base pass needs
	// to separate specular and diffuse lighting in a checkerboard pattern
	{
		bool bChecker = CheckerFromPixelPos(MaterialParameters.SvPosition.xy);

		AdjustBaseColorAndSpecularForCheckerDeferredLighting(LocalBaseColor, LocalSpecular, bChecker);
	}
	#endif

	half Opacity = GetMaterialOpacity(PixelMaterialInputs);

	FGBufferData GBuffer = (FGBufferData)0;

	GBuffer.WorldNormal = MaterialParameters.WorldNormal;
	GBuffer.BaseColor = BaseColor;
	GBuffer.Metallic = Metallic;
	GBuffer.Specular = Specular;
	GBuffer.Roughness = Roughness;
	GBuffer.GBufferAO = MaterialAO;
	GBuffer.PerObjectGBufferData = Primitive.PerObjectGBufferData;
	GBuffer.Depth = MaterialParameters.ScreenPosition.w;

	GBuffer.PrecomputedShadowFactors = GetPrecomputedShadowMasks(Interpolants);

	#if MATERIAL_SHADINGMODEL_UNLIT
	GBuffer.ShadingModelID = SHADINGMODELID_UNLIT;
	#elif MATERIAL_SHADINGMODEL_DEFAULT_LIT
	GBuffer.ShadingModelID = SHADINGMODELID_DEFAULT_LIT;
	#elif MATERIAL_SHADINGMODEL_SUBSURFACE
	GBuffer.ShadingModelID = SHADINGMODELID_SUBSURFACE;
	GBuffer.CustomData.rgb = EncodeSubsurfaceColor(SubsurfaceColor);
	GBuffer.CustomData.a = Opacity;
	#elif MATERIAL_SHADINGMODEL_PREINTEGRATED_SKIN
	GBuffer.ShadingModelID = SHADINGMODELID_PREINTEGRATED_SKIN;
	GBuffer.CustomData.rgb = EncodeSubsurfaceColor(SubsurfaceColor);
	GBuffer.CustomData.a = Opacity;
	#elif MATERIAL_SHADINGMODEL_SUBSURFACE_PROFILE
	GBuffer.ShadingModelID = SHADINGMODELID_SUBSURFACE_PROFILE;
	GBuffer.CustomData.rgb = EncodeSubsurfaceProfile(SubsurfaceProfile);
	GBuffer.CustomData.a = Opacity;
	#elif MATERIAL_SHADINGMODEL_CLEAR_COAT
	{
		GBuffer.ShadingModelID = SHADINGMODELID_CLEAR_COAT;

		float ClearCoat = saturate( GetMaterialCustomData0(MaterialParameters) );
		float ClearCoatRoughness = saturate( GetMaterialCustomData1(MaterialParameters) );
		float MetalSpec = 0.9;

		float NoV = saturate( dot( MaterialParameters.WorldNormal, MaterialParameters.CameraVector ) );

		// Approximation of refraction's effect on EnvBRDF
		float RefractionScale = ( (NoV * 0.5 + 0.5) * NoV - 1 ) * saturate( 1.25 - 1.25 * Roughness ) + 1;

		// Approximation of absorption integral, tuned for Roughness=0.4
		float3 AbsorptionColor = BaseColor * (1 / MetalSpec);
		float3 Absorption = AbsorptionColor * ( (NoV - 1) * 0.85 * ( 1 - lerp( AbsorptionColor, Square(AbsorptionColor), -0.78 ) ) + 1 );

		GBuffer.BaseColor = lerp( BaseColor, lerp( BaseColor, MetalSpec * Absorption, Metallic ) * RefractionScale, ClearCoat );
		GBuffer.Specular *= lerp( 1, RefractionScale, ClearCoat );

		GBuffer.CustomData.x = ClearCoat;
		GBuffer.CustomData.y = ClearCoatRoughness;

		#if CLEAR_COAT_BOTTOM_NORMAL
		{
			float2 oct2 = UnitVectorToOctahedron(GBuffer.WorldNormal);

			#if NUM_MATERIAL_OUTPUTS_CLEARCOATBOTTOMNORMAL > 0
			#if MATERIAL_TANGENTSPACENORMAL
			float3 tempnormal = normalize(TransformTangentVectorToWorld( MaterialParameters.TangentToWorld, ClearCoatBottomNormal0(MaterialParameters) ));
			#else
			float3 tempnormal = ClearCoatBottomNormal0(MaterialParameters);
			#endif

			float2 oct1 = UnitVectorToOctahedron(tempnormal);
			float2 oct3 = ( (oct1 - oct2) * 0.5 ) + (128.0/255.0);
			GBuffer.CustomData.a = oct3.x;
			GBuffer.CustomData.z = oct3.y;
			#else
			GBuffer.CustomData.a = 128.0/255.0;
			GBuffer.CustomData.z = 128.0/255.0;
			#endif
		}
		#endif
	}
	#elif MATERIAL_SHADINGMODEL_TWOSIDED_FOLIAGE
	GBuffer.ShadingModelID = SHADINGMODELID_TWOSIDED_FOLIAGE;
	GBuffer.CustomData.rgb = EncodeSubsurfaceColor(SubsurfaceColor);
	GBuffer.CustomData.a = Opacity;
	#elif MATERIAL_SHADINGMODEL_HAIR
	GBuffer.ShadingModelID = SHADINGMODELID_HAIR;
	GBuffer.CustomData.xy = UnitVectorToOctahedron( MaterialParameters.WorldNormal ) * 0.5 + 0.5;
	GBuffer.CustomData.z = saturate( GetMaterialCustomData0(MaterialParameters) );	// Backlit
	#elif MATERIAL_SHADINGMODEL_CLOTH
	GBuffer.ShadingModelID = SHADINGMODELID_CLOTH;
	GBuffer.CustomData.rgb = SubsurfaceColor;
	GBuffer.CustomData.a = saturate( GetMaterialCustomData0(MaterialParameters) );	// Cloth
	GBuffer.IndirectIrradiance *= 1 - GBuffer.CustomData.a;
	#elif MATERIAL_SHADINGMODEL_EYE
	GBuffer.ShadingModelID = SHADINGMODELID_EYE;
	#if NUM_MATERIAL_OUTPUTS_GETTANGENTOUTPUT > 0
	float3 Tangent = GetTangentOutput0(MaterialParameters);
	GBuffer.CustomData.xy = UnitVectorToOctahedron( normalize(Tangent) ) * 0.5 + 0.5;
	#endif
	GBuffer.CustomData.z = saturate( GetMaterialCustomData0(MaterialParameters) );	// Iris Mask
	GBuffer.CustomData.w = saturate( GetMaterialCustomData1(MaterialParameters) );	// Iris Distance
	#else
	// missing shading model, compiler should report ShadingModelID is not set
	#endif

	#if USES_GBUFFER
	GBuffer.SelectiveOutputMask = GetSelectiveOutputMask();

	#if WRITES_VELOCITY_TO_GBUFFER
	{
		// 2d velocity, includes camera and object motion
		#if WRITES_VELOCITY_TO_GBUFFER_USE_POS_INTERPOLATOR
		float2 Velocity = Calculate2DVelocity(BasePassInterpolants.VelocityScreenPosition, BasePassInterpolants.VelocityPrevScreenPosition);
		#else
		float2 Velocity = Calculate2DVelocity(MaterialParameters.ScreenPosition, BasePassInterpolants.VelocityPrevScreenPosition);
		#endif

		// Make sure not to touch 0,0 which is clear color
		GBuffer.Velocity = float4(EncodeVelocityToTexture(Velocity), 0, 0) * BasePassInterpolants.VelocityPrevScreenPosition.z;
	}
	#else
	GBuffer.Velocity = 0;
	#endif
	#endif

	// So that the following code can still use DiffuseColor and SpecularColor.
	GBuffer.DiffuseColor = LocalBaseColor - LocalBaseColor * Metallic;
	GBuffer.SpecularColor = lerp( 0.08 * LocalSpecular.xxx, LocalBaseColor, Metallic.xxx );
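	// For reference (restating the standard parameterization implied by the two lines
	// above, not separate engine documentation): for dielectrics F0 = 0.08 * Specular,
	// so the default Specular = 0.5 gives F0 = 0.04 (4% reflectance); at Metallic = 1
	// the specular color becomes BaseColor and the diffuse term goes to zero.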

	#if USE_DEVELOPMENT_SHADERS
	{
		// this feature is only needed for development/editor - we can compile it out for a shipping build (see r.CompileShadersForDevelopment cvar help)
		GBuffer.DiffuseColor = GBuffer.DiffuseColor * View.DiffuseOverrideParameter.w + View.DiffuseOverrideParameter.xyz;
		GBuffer.SpecularColor = GBuffer.SpecularColor * View.SpecularOverrideParameter.w + View.SpecularOverrideParameter.xyz;
	}
	#endif

	#if FORCE_FULLY_ROUGH
	// Factors derived from EnvBRDFApprox( SpecularColor, 1, 1 ) == SpecularColor * 0.4524 - 0.0024
	GBuffer.DiffuseColor += GBuffer.SpecularColor * 0.45;
	GBuffer.SpecularColor = 0;
	GBuffer.Roughness = 1;
	#endif

	half3 Color = 0;
	float IndirectIrradiance = 0;

	#if !MATERIAL_SHADINGMODEL_UNLIT

	float3 DiffuseDir = MaterialParameters.WorldNormal;
	float3 DiffuseColorForIndirect = GBuffer.DiffuseColor;

	#if MATERIAL_SHADINGMODEL_SUBSURFACE || MATERIAL_SHADINGMODEL_PREINTEGRATED_SKIN
	// Add subsurface energy to diffuse
	//@todo - better subsurface handling for these shading models with skylight and precomputed GI
	DiffuseColorForIndirect += SubsurfaceColor;
	#endif

	#if MATERIAL_SHADINGMODEL_CLOTH
	DiffuseColorForIndirect += SubsurfaceColor * saturate( GetMaterialCustomData0(MaterialParameters) );
	#endif

	#if MATERIAL_SHADINGMODEL_HAIR
	{
		float3 N = MaterialParameters.WorldNormal;
		float3 V = MaterialParameters.CameraVector;
		float3 L = normalize( V - N * dot(V,N) );
		DiffuseDir = L;
		DiffuseColorForIndirect = 2*PI * HairShading( GBuffer, L, V, N, 1, 0, 0.2, uint2(0,0) );
	}
	#endif

	float3 DiffuseIndirectLighting;
	float3 SubsurfaceIndirectLighting;
	GetPrecomputedIndirectLightingAndSkyLight(MaterialParameters, Interpolants, BasePassInterpolants, DiffuseDir, DiffuseIndirectLighting, SubsurfaceIndirectLighting, IndirectIrradiance);

	float IndirectOcclusion = 1.0f;

	#if FORWARD_SHADING && (MATERIALBLENDING_SOLID || MATERIALBLENDING_MASKED)
	IndirectOcclusion = GetIndirectOcclusion(MaterialParameters.ScreenPosition);
	DiffuseIndirectLighting *= IndirectOcclusion;
	SubsurfaceIndirectLighting *= IndirectOcclusion;
	IndirectIrradiance *= IndirectOcclusion;
	#endif

	Color += (DiffuseIndirectLighting * DiffuseColorForIndirect + SubsurfaceIndirectLighting * SubsurfaceColor) * MaterialAO;

	#if FORWARD_SHADING || TRANSLUCENCY_LIGHTING_SURFACE_PERPIXEL
	#if LIGHT_GRID_FORWARD_SHADING
	Color += GetForwardDirectLighting(MaterialParameters.SvPosition, MaterialParameters.ScreenPosition, MaterialParameters.AbsoluteWorldPosition, MaterialParameters.CameraVector, GBuffer);
	#endif

	Color += GetImageBasedReflectionLighting(MaterialParameters, GBuffer.Roughness, GBuffer.SpecularColor, IndirectIrradiance) * IndirectOcclusion;
	#endif

	#if SIMPLE_FORWARD_DIRECTIONAL_LIGHT

	float3 DirectionalLighting = GetSimpleForwardLightingDirectionalLight(
		GBuffer,
		DiffuseColorForIndirect,
		GBuffer.SpecularColor,
		Roughness,
		MaterialParameters.WorldNormal,
		MaterialParameters.CameraVector);

	#if STATICLIGHTING_SIGNEDDISTANCEFIELD
	DirectionalLighting *= GBuffer.PrecomputedShadowFactors.x;
	#elif CACHED_POINT_INDIRECT_LIGHTING
	DirectionalLighting *= PrecomputedLightingBuffer.DirectionalLightShadowing;
	#endif

	Color += DirectionalLighting;

	#endif
	#endif

	#if NEEDS_BASEPASS_FOGGING
	float4 VertexFog = BasePassInterpolants.VertexFog;
	#else
	float4 VertexFog = float4(0,0,0,1);
	#endif

	// Volume lighting for lit translucency
	#if (MATERIAL_SHADINGMODEL_DEFAULT_LIT || MATERIAL_SHADINGMODEL_SUBSURFACE) && (MATERIALBLENDING_TRANSLUCENT || MATERIALBLENDING_ADDITIVE) && !SIMPLE_FORWARD_SHADING && !FORWARD_SHADING
	Color += GetTranslucencyLighting(MaterialParameters, PixelMaterialInputs, BasePassInterpolants, GBuffer, IndirectIrradiance);
	#endif

	#if !MATERIAL_SHADINGMODEL_UNLIT && USE_DEVELOPMENT_SHADERS
	Color = lerp(Color, GBuffer.DiffuseColor + GBuffer.SpecularColor * 0.45, View.UnlitViewmodeMask);
	#endif

	half3 Emissive = GetMaterialEmissive(PixelMaterialInputs);

	#if USE_DEVELOPMENT_SHADERS
	// this feature is only needed for development/editor - we can compile it out for a shipping build (see r.CompileShadersForDevelopment cvar help)
	#if METAL_SM5_PROFILE || SM5_PROFILE || SM4_PROFILE || METAL_SM4_PROFILE
	BRANCH
	if (View.OutOfBoundsMask > 0)
	{
		if (any(abs(MaterialParameters.AbsoluteWorldPosition - Primitive.ObjectWorldPositionAndRadius.xyz) > Primitive.ObjectBounds + 1))
		{
			float Gradient = frac(dot(MaterialParameters.AbsoluteWorldPosition, float3(.577f, .577f, .577f)) / 500.0f);
			Emissive = lerp(float3(1,1,0), float3(0,1,1), Gradient.xxx > .5f);
			Opacity = 1;
		}
	}
	#endif
	#endif

	Color += Emissive;

	#if MATERIALBLENDING_TRANSLUCENT
	Out.MRT[0] = half4(Color * VertexFog.a + VertexFog.rgb, Opacity);
	Out.MRT[0] = RETURN_COLOR(Out.MRT[0]);
	#elif MATERIALBLENDING_ADDITIVE
	Out.MRT[0] = half4(Color * VertexFog.a * Opacity, 0.0f);
	Out.MRT[0] = RETURN_COLOR(Out.MRT[0]);
	#elif MATERIALBLENDING_MODULATE
	// RETURN_COLOR not needed with modulative blending
	half3 FoggedColor = lerp(float3(1, 1, 1), Color, VertexFog.aaa * VertexFog.aaa);
	Out.MRT[0] = half4(FoggedColor, Opacity);
	#else
	{
		FLightAccumulator LightAccumulator = (FLightAccumulator)0;

		LightAccumulator_Add(LightAccumulator, Color * VertexFog.a + VertexFog.rgb, 0, 1.0f);
		Out.MRT[0] = RETURN_COLOR(LightAccumulator_GetResult(LightAccumulator));

		#if SUBSURFACE_CHANNEL_MODE == 0
		// Scene captures and planar reflections need opacity in destination alpha.
		Out.MRT[0].a = 0;
		#endif
	}
	#endif

	#if USES_GBUFFER
	GBuffer.IndirectIrradiance = IndirectIrradiance;

	// -0.5 .. 0.5, could be optimized as lower quality noise would be sufficient
	float QuantizationBias = PseudoRandom( MaterialParameters.SvPosition.xy ) - 0.5f;
	EncodeGBuffer(GBuffer, Out.MRT[1], Out.MRT[2], Out.MRT[3], OutGBufferD, OutGBufferE, OutVelocity, QuantizationBias);
	#endif

	if(bEditorWeightedZBuffering)
	{
		Out.MRT[0].a = 1;

		#if MATERIALBLENDING_MASKED
		// some materials might have an opacity value
		Out.MRT[0].a = GetMaterialMaskInputRaw(PixelMaterialInputs);
		#endif
		// we output premultiplied alpha so we have to darken all 4 channels
		Out.MRT[0] *= ClipForEditorPrimitives(MaterialParameters);

		#if EDITOR_ALPHA2COVERAGE != 0
		// per MSAA sample
		if(MSAASampleCount > 1)
		{
			Out.Coverage = In.Coverage & CustomAlpha2Coverage(Out.MRT[0]);
		}
		else
		{
			// without MSAA this is handled per pixel
			clip(Out.MRT[0].a - GetMaterialOpacityMaskClipValue());
		}
		#else
		// per pixel
		clip(Out.MRT[0].a - GetMaterialOpacityMaskClipValue());
		#endif
	}

	#if USES_GBUFFER
	// extra starts at MRT4 (see PIXELSHADEROUTPUT_MRT3)
	int MRT_Write_Index = 4;

	#if WRITES_VELOCITY_TO_GBUFFER
	Out.MRT[MRT_Write_Index++] = OutVelocity;
	#endif

	// we don't test for WRITES_CUSTOMDATA_TO_GBUFFER because GetGBufferRenderTargets() is not masking the MRT based on it, we mask by GetSelectiveOutputMask instead
	Out.MRT[MRT_Write_Index++] = OutGBufferD;

	#if WRITES_PRECSHADOWFACTOR_TO_GBUFFER
	Out.MRT[MRT_Write_Index++] = OutGBufferE;
	#endif

	#endif
}

// Two options control the render target bindings: OUTPUT_GBUFFER_VELOCITY and ALLOW_STATIC_LIGHTING
// Big issue here on PC with AMD hardware: putting the velocity buffer after an optional RT fails!
#define VELOCITY_OUTPUT ((USES_GBUFFER) && (WRITES_VELOCITY_TO_GBUFFER))
// we don't test for WRITES_CUSTOMDATA_TO_GBUFFER because GetGBufferRenderTargets() is not masking the MRT based on it, we mask by GetSelectiveOutputMask instead
#define CUSTOMDATA_OUTPUT (USES_GBUFFER)
#define PRESHADOWFACTOR_OUTPUT ((USES_GBUFFER) && (WRITES_PRECSHADOWFACTOR_TO_GBUFFER))

#define EXTRA_MRT_COUNT (VELOCITY_OUTPUT + CUSTOMDATA_OUTPUT + PRESHADOWFACTOR_OUTPUT)
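// Example binding (assuming a GBuffer pass with all three optional outputs enabled):
// EXTRA_MRT_COUNT = 3, so MRT0 = scene color, MRT1..3 = GBuffer targets,
// MRT4 = velocity, MRT5 = custom data, MRT6 = precomputed shadow factors -
// matching the MRT_Write_Index order used in FPixelShaderInOut_MainPS above.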

// the following needs to match the code in FSceneRenderTargets::GetGBufferRenderTargets()

#define PIXELSHADEROUTPUT_BASEPASS 1
#define PIXELSHADEROUTPUT_MRT0 (1)
#define PIXELSHADEROUTPUT_MRT1 (USES_GBUFFER)
#define PIXELSHADEROUTPUT_MRT2 (USES_GBUFFER)
#define PIXELSHADEROUTPUT_MRT3 (USES_GBUFFER)
#define PIXELSHADEROUTPUT_MRT4 (EXTRA_MRT_COUNT > 0)
#define PIXELSHADEROUTPUT_MRT5 (EXTRA_MRT_COUNT > 1)
#define PIXELSHADEROUTPUT_MRT6 (EXTRA_MRT_COUNT > 2)
#define PIXELSHADEROUTPUT_A2C ((EDITOR_ALPHA2COVERAGE) != 0)
// all PIXELSHADEROUTPUT_ macros and "void FPixelShaderInOut_MainPS()" need to be set up before this include
// this include generates the wrapper code to call MainPS(inout FPixelShaderOutput PixelShaderOutput)
#include "PixelShaderOutputCommon.usf"