BNPR Exclusive: Edge Nodes (Part 1)

This is part one of the edge nodes development diary. Here we interview the three main developers about the nodes and how it all started. Here goes!

Well, Daniel, how did it start?

Daniel Kreuter: When I became interested in Non-Photorealistic Rendering a couple of months ago, the first thing I wanted to know was how to render lines such as contours, folds and other types of edges. There were already some methods to achieve that, but I wasn't satisfied with any of them: on the one hand they weren't accurate enough, and on the other they weren't very flexible.

Then what did you do?

Daniel Kreuter: I started working on my own setup, and one of the first things I found out was that the normal pass holds most of the information necessary to detect edges. By blurring it and comparing the result to the original with a Mix node set to 'Difference', I was already able to get lines that weren't that bad. If you're interested in seeing that setup, you can get the first attempt here: "Contour Rendering Node Setup"
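For readers who want to see the arithmetic behind that node chain, here is a minimal NumPy sketch of the same idea (blur the normal pass, then take the per-pixel difference against the original). The function names and the box-blur kernel are illustrative assumptions, not the actual node setup:

```python
import numpy as np

def box_blur(img, radius=1):
    """Naive box blur standing in for the compositor's Blur node.
    Averages each pixel with its (2r+1)^2 neighborhood (edges wrap)."""
    out = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out / (2 * radius + 1) ** 2

def edges_from_normals(normal_pass, radius=1):
    """Mix node set to 'Difference': |original - blurred|.
    On flat surfaces the blur changes nothing, so the difference is 0;
    where normals change abruptly (an edge), the difference survives."""
    blurred = box_blur(normal_pass, radius)
    diff = np.abs(normal_pass - blurred)
    # Collapse the RGB difference to a single edge mask
    return diff.max(axis=-1)

# Two flat regions with different normals: edges appear only at the seam
normal = np.zeros((8, 8, 3))
normal[:, :4] = [0.0, 0.0, 1.0]   # facing +Z
normal[:, 4:] = [1.0, 0.0, 0.0]   # facing +X
mask = edges_from_normals(normal)
```

The key property this demonstrates is why the normal pass beats the plain image for line detection: two differently-oriented faces produce a strong difference even when their shaded colors happen to match.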

After that I met Light BWK, who is also interested in rendering edges and was working on a setup himself. We combined our ideas and started working on a new setup that should be able to handle all kinds of situations and be as reliable as possible.

Nice. Thanks, Daniel. Let's hear it from Light's perspective. Light, how did it all start?

Light BWK: I started way earlier. Freestyle development wasn't going anywhere (at least at that time), plus the rendering load for Freestyle to produce lines was, and still is, too expensive for a new studio, so I was looking for alternatives. There was a nice plugin for 3ds Max called Pencil+, which doesn't use mesh data to render contours and outlines. But I wasn't going to spend that much money just for a few lines and disrupt the pipeline.

Pencil+ is exciting. So what happened afterwards?

Light BWK: Then I saw Daniel's Z pass and normal setup. It was a start, but it didn't impress me (sorry, Daniel). There were too many variables in the RGB Curves and Color Ramp nodes (you can't set those nodes precisely twice). But it got us talking. Ideas were thrown back and forth, blend files were swapped.

Did you guys find anything interesting?

Light BWK: Yes, we did. One blend file that got my thinker blinking was the use of the normal pass with a Laplace filter to get edges (Wiki: Canny Edge Detector). It worked, but it was ugly: the lines were a minimum of 3 pixels wide. I took the initial nodes, showed them to the guys in the Freestyle Collaboration FB group and said that "this might be an alternative to Freestyle." Lee chipped in to do some render tests. We concluded that if Freestyle could have the same data, the resulting edges would have better definition, and that lit another imaginary LED.
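To make the Laplace-on-normals idea concrete, here is a small NumPy sketch, assuming the standard 3x3 Laplacian kernel (the compositor's Filter node has a 'Laplace' mode; the helper names below are ours, not Blender's):

```python
import numpy as np

# Standard discrete Laplacian: responds to changes, zero on flat regions
LAPLACE = np.array([[0,  1, 0],
                    [1, -4, 1],
                    [0,  1, 0]], dtype=float)

def convolve3x3(channel, kernel):
    """Tiny 3x3 convolution with edge-clamped padding (no SciPy needed)."""
    padded = np.pad(channel, 1, mode="edge")
    h, w = channel.shape
    out = np.zeros_like(channel)
    for ky in range(3):
        for kx in range(3):
            out += kernel[ky, kx] * padded[ky:ky + h, kx:kx + w]
    return out

def laplace_edges(normal_pass):
    """Run the Laplacian over each channel of the normal pass and keep
    the strongest response: flat surfaces give ~0, creases light up."""
    responses = [np.abs(convolve3x3(normal_pass[..., c], LAPLACE))
                 for c in range(normal_pass.shape[-1])]
    return np.maximum.reduce(responses)
```

Note how a single step in the normals fires the kernel on both sides of the boundary, which is exactly why the raw result gives lines several pixels wide, the "ugly" thickness mentioned above.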

Light BWK: Over the next few days, the node setup haunted me. There were a few things I wanted to fix. One of them was to remove the color ramp whose black and white stops sat so close they were almost touching. My solution was very lame: I took the output of the Laplace filter, blurred it, then separated the RGB and ran each channel through a Math node set to 'Greater Than', combined the channels with 'Add', and fed the result into the factor of a Mix node to produce dark lines. For the first time, the setup could define intersections well. The lines were still very thick.
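The chain Light describes (blur, Separate RGB, per-channel 'Greater Than', 'Add', then the Mix factor) can be sketched in NumPy as follows. The blur radius and threshold values are assumptions for illustration; the node setup itself would use whatever values were dialed in:

```python
import numpy as np

def line_factor(laplace_rgb, blur_radius=1, threshold=0.1):
    """Blur -> Separate RGB -> Math 'Greater Than' per channel -> Add,
    clamped so it can drive a Mix node's factor input."""
    # Blur: simple box average standing in for the Blur node (edges wrap)
    r = blur_radius
    blurred = np.zeros_like(laplace_rgb)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            blurred += np.roll(np.roll(laplace_rgb, dy, axis=0), dx, axis=1)
    blurred /= (2 * r + 1) ** 2
    # Separate RGB + 'Greater Than' on each channel: hard 0/1 masks
    gt = (blurred > threshold).astype(float)
    # Combine with 'Add'; clamp like a factor socket
    return np.clip(gt.sum(axis=-1), 0.0, 1.0)

def darken_lines(render, factor):
    """Mix node toward black: factor 0 keeps the render, 1 paints lines."""
    return render * (1.0 - factor)[..., None]
```

The 'Greater Than' step is what replaced the touchy color ramp: a numeric threshold can be typed in exactly, so the setup becomes reproducible, which was the whole point of the fix.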

I showed Daniel and Lee the setup and we swapped the blend file. Daniel identified a problem: the setup produced too much noise. Lee was happily rendering, but the noise was very visible. Over the next few days, I took all my knowledge from Freestyle and built a system of node modules.

Thanks, Light. Now it's your turn, Lee. How was the testing?

Lee Posey: I remembered a file with an old way I had tried to do edge detection 3 years ago. Trying to make a 2D character from a 3D one, I discovered different edge detection methods. My own testing gave me this as a result: I was painting UV textures and using the Sobel filter to make a toon-looking character. This proved to be tough and time-consuming. Here is a character render:

Lee Posey: And here is the UV texture (not all of it) of the human character:

Lee Posey: There were big issues with this method, mainly for animating. Color regions change whatever they touch, making planning a nightmare! I did, however, create a successful simulation with it:

Cyclone from Jikz on Vimeo.

Lee Posey: Here is a sample image of the color of the cyclone:

Lee Posey: With this method, I needed to individually edit the material of each face or face group to make the Sobel filter detect edges properly. This posed a challenge for the camera rotation in the video.

If you want to play with it, you can get it from Blendswap:

Skip ahead 3 years to the present…

I was happily testing and coming up with crazy scenarios for the newly discovered node-based edges. While doing this, I discovered you can use Z transparency with an alpha of 0.5, which gives optimal results for hidden lines. You can then plug this hidden-line render and a render of only the visible lines into a Difference node to gain control over the colors of visible and hidden lines separately:
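The trick Lee describes can be sketched numerically: subtracting the visible-only render from the Z-transparency render leaves just the hidden lines, after which each set of lines can be tinted on its own. This is a NumPy sketch with made-up placeholder colors, not the actual blend file:

```python
import numpy as np

def split_line_passes(all_lines, visible_lines):
    """Difference node: (all lines) - (visible lines) isolates the
    lines that were only present in the hidden-line render."""
    return np.abs(all_lines - visible_lines)

def tint_lines(visible_mask, hidden_mask,
               visible_color=(0.0, 0.0, 0.0),    # assumed: solid black
               hidden_color=(0.5, 0.5, 1.0)):    # assumed: pale blue
    """Composite the two masks with separate colors over white,
    like feeding each mask into the factor of its own Mix node."""
    h, w = visible_mask.shape
    out = np.ones((h, w, 3))
    v = visible_mask[..., None]
    out = out * (1 - v) + np.array(visible_color) * v
    hdn = hidden_mask[..., None]
    out = out * (1 - hdn) + np.array(hidden_color) * hdn
    return out
```

Because the hidden lines come through Z transparency at half strength, they end up with a partial mask value, which conveniently renders them fainter than the visible lines by default.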

Nice render. I also heard that you have another setup, one without the normal pass?

Lee Posey: Another highly experimental method of accomplishing lines is using a light rig: lighting the scene with shadowless sun lamps or hemi lamps from 3-6 directions. This introduces many complications and variables, but it is a viable method if you REALLY need it. The advantage is that you can tweak individual lights to tune edges. Again, this method is extremely fresh and not highly tested; I will be looking into it more. The basic tenet of the idea is still the same: produce color differences for the edge detection filter to [possibly] work more efficiently.
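Why would a multi-directional light rig help an edge filter? One way to see it, sketched below under the assumption of one lamp per RGB channel (a simplification; Lee's rig uses 3-6 lamps of unspecified colors), is that directional shadowless lighting effectively bakes the surface orientation into color, so differently-angled faces get different colors for the filter to catch:

```python
import numpy as np

def light_rig_colors(normals, light_dirs):
    """Shadowless Lambert shading from several directions: each lamp
    tints the surface by how much it faces that lamp, so faces with
    different normals end up with different colors -- exactly the
    contrast an edge-detection filter needs.
    normals: (N, 3) unit normals; light_dirs: three lamp directions,
    one driving each RGB channel (an assumption for this sketch)."""
    dirs = np.array([np.asarray(d, float) / np.linalg.norm(d)
                     for d in light_dirs])          # (3, 3)
    # Clamped Lambert term per lamp, stacked as RGB
    return np.clip(normals @ dirs.T, 0.0, 1.0)      # (N, 3)
```

With axis-aligned lamps this reduces to a normal-pass substitute, which is consistent with the idea that the rig only changes where the color differences come from, not how the edge filter uses them.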

Thank you, Lee, for the many renders and for sharing your priceless experience.

Ending note: Closed beta testing will start soon. Anyone who is interested can leave a comment below to apply. The number of testers is limited, as the development team can't handle too many testers at one time. And the development team leaves you with a nice line rendering of a Suzuki 2008 B-King model by SPLATYPI.

  • very cool

  • This is all pretty pretty pretty. I’ve experimented with different nodes as well; I use the Internal Edge rendering while I’m testing and working everything else out, and I’m not sure yet what I’ll end up using for finished animations (I don’t have any yet 🙁 ).

    • Light

      You should consider using this setup. It has better quality compared to Blender’s original edge. But if you want to do archviz, Freestyle is the better choice.

  • vicentecarro

    Great 😉
