
Why do different render engines generate different Z passes?




I've been using Blender to generate depth maps via the Z pass. I noticed that the Z passes generated by different render engines differ, which confused me a bit. My impression is that the Z pass generated by Cycles stores the distance from each pixel to the camera center, while Blender Render stores the orthogonal distance to the camera plane. Do I understand this correctly? If so, is there a way to change this behavior in both render engines?



As an example, the bottom area of the model is a flat surface at which the camera points perpendicularly. Below are the nodes I use to normalize and visualize the Z pass data (the Viewer node is used to save the depth map).



Nodes to normalize and view



Z pass with Cycles render:



Z pass with Cycles render



Z pass with Blender Render (same value everywhere in the bottom part):



Z pass with Blender Render
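To make the suspected difference concrete, here is a small numerical sketch (plain NumPy, not Blender; the pinhole focal length and principal point are made-up values): for a flat wall parallel to the sensor, planar depth is constant across the image while radial distance to the camera center grows toward the corners.

```python
import numpy as np

# Hypothetical pinhole camera: focal length f and principal point (cx, cy), in pixels
f, cx, cy = 800.0, 320.0, 240.0
h, w = 480, 640
z = 2.0  # planar depth of a flat wall parallel to the sensor

# Normalized ray direction per pixel: (x, y, 1)
u, v = np.meshgrid(np.arange(w), np.arange(h))
x = (u - cx) / f
y = (v - cy) / f

planar = np.full((h, w), z)            # Blender-Render-style Z pass: constant on the wall
radial = z * np.sqrt(x**2 + y**2 + 1)  # Cycles-style: distance to the camera center
```

Here `radial` equals `planar` only at the principal point and increases toward the image corners, which matches the gradient visible in the Cycles render above but not in the Blender Render one.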











  • Read this related link: Precision of z coordinate
    – cegaton, 11 hours ago










  • Also related: Cycles generates distorted depth
    – cegaton, 11 hours ago















cycles rendering blender-render render-passes






asked 13 hours ago
– DingLuo
1 Answer
When rendering a Z pass you are essentially creating a depth map from the camera's point of view. The issue is that there is a potentially infinite range of distances to represent, while a traditional 8-bit image gives you only 256 shades of gray to map them to.



The range can go from zero right at the camera (though it is unlikely anything sits that close) out to whichever visible object is most distant. There may also be a sky or "background" at a theoretically infinite distance.



There are several possible ways to map these shades of gray onto the distance progression, each with its own advantages.



It can be a linear mapping, where detail is distributed evenly across the whole image, but there are also logarithmic mappings that emphasize detail in certain parts of the picture.



  • You may want more detail at close range, where the image's focus is likely to reside.

  • The scene may require more detail at large distances if you are rendering a landscape or distant view.

  • You may want to use it for a mist pass, which requires detail at medium range.
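These mapping choices can be sketched as a small post-processing step. A minimal illustration in NumPy (my own sketch, not the compositor's Normalize node) that maps a floating-point depth buffer to 8-bit gray either linearly or logarithmically, clipping the infinite background to the farthest finite value:

```python
import numpy as np

def depth_to_gray(depth, mode="linear"):
    """Map a float depth buffer to 8-bit grayscale.
    Infinite 'background' values are clipped to the farthest
    finite depth before normalization."""
    finite = depth[np.isfinite(depth)]
    near, far = finite.min(), finite.max()
    d = np.clip(depth, near, far) - near
    if mode == "log":
        d = np.log1p(d)  # spends more gray levels on depths near the camera
    return (255 * d / d.max()).astype(np.uint8)
```

With `mode="log"`, a point midway through the depth range maps to a brighter gray than under the linear mapping, i.e. close-range detail gets more of the available shades.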

As far as I know, I would expect both Cycles and Blender Render to use the same "true distance to sensor", not a virtual orthographic plane passing through the sensor, but I may be wrong.
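If one engine does give true (radial) distance while you need planar depth, the two are related analytically under a pinhole-camera assumption. A sketch (the focal length and principal point in pixels are hypothetical inputs, not values Blender exposes under these names):

```python
import numpy as np

def radial_to_planar(radial, f, cx, cy):
    """Convert a per-pixel distance-to-camera-center map into
    distance along the optical axis, assuming a pinhole camera
    with focal length f and principal point (cx, cy) in pixels."""
    h, w = radial.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) / f
    y = (v - cy) / f
    # Along the ray (x, y, 1): radial = planar * sqrt(x^2 + y^2 + 1)
    return radial / np.sqrt(x**2 + y**2 + 1)
```

Multiplying by the same square-root factor instead of dividing performs the opposite conversion, from planar depth to radial distance.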



If that is indeed the case, or if you require a specific color progression or a custom mapping of values, you can construct your own artificial Z pass.



You can do so by making a basic emission shader with a circular black-to-white gradient mapped to the camera object.



Moving the camera updates the gradient's position. You can scale the gradient to cover your desired distance range, and run it through a Color Ramp node for a non-linear progression.






answered 12 hours ago (edited 11 hours ago)
– Duarte Farrajota Ramos