Researchers from the University of Michigan, NetEase Fuxi AI Lab, and Beihang University in China recently introduced Stylized Neural Painter, an automatic tool that generates vivid, realistic artwork in controllable styles from input images. The method builds on generative modeling, image-to-image translation, and style transfer techniques backed by deep neural networks.
This method takes existing image-to-image translation approaches a step further. Previous methods formulated the translation as pixel-wise prediction or as a continuous optimization process in pixel space. The new approach instead treats the creative process in a vectorized environment, producing a sequence of physically meaningful stroke parameters that can then be used for rendering. Because a vector renderer is not differentiable, the team designed a neural renderer that imitates the vector renderer's behavior, and framed stroke prediction as a parameter search process that maximizes the similarity between the input image and the rendered output.
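The parameter-search idea can be illustrated with a toy sketch. Here a simple Gaussian-blob function stands in for the neural renderer (the paper's renderer is a learned network, and it would be optimized with backpropagation rather than the numerical gradients used below); the stroke parameters are adjusted by gradient descent to minimize the pixel distance between the rendering and a target image. All names and parameter choices here are illustrative, not from the paper.

```python
import numpy as np

def render_stroke(params, size=32):
    """Toy differentiable stand-in for the neural renderer.
    params = (cx, cy, radius, intensity), all roughly in [0, 1]."""
    cx, cy, r, a = params
    ys, xs = np.mgrid[0:size, 0:size] / (size - 1)
    d2 = (xs - cx) ** 2 + (ys - cy) ** 2
    return a * np.exp(-d2 / (2 * (r + 1e-3) ** 2))

def fit_stroke(target, steps=500, lr=0.3, eps=1e-4):
    """Stroke prediction as parameter search: minimize the mean squared
    pixel error between rendering and target by gradient descent
    (numerical gradients here; the paper backpropagates through the
    neural renderer instead)."""
    params = np.array([0.5, 0.5, 0.2, 0.5])
    for _ in range(steps):
        base = np.mean((render_stroke(params) - target) ** 2)
        grad = np.zeros_like(params)
        for i in range(len(params)):
            p = params.copy()
            p[i] += eps
            grad[i] = (np.mean((render_stroke(p) - target) ** 2) - base) / eps
        params -= lr * grad
    return params

# Usage: recover a known stroke from its rendered image.
true_params = np.array([0.3, 0.7, 0.15, 0.9])
target = render_stroke(true_params)
est = fit_stroke(target)
```

The same loop, run stroke by stroke against the input photograph, is essentially what turns an image into an ordered sequence of stroke parameters.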
The system simulates human painting behavior, generating vectorized strokes sequentially, each with clear physical meaning. Because the output is a stroke sequence rather than pixels, it can be rendered at arbitrary output resolution. The researchers claim their method can reproduce a variety of painting styles, such as oil-paint brush, marker pen, watercolor ink, and tape art. Unlike previous neural renderers, their redesigned rendering process uses a rasterization network to better disentangle stroke shape from color.
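A minimal sketch can show both properties at once: shape and color kept separate, and resolution-free rendering. The `rasterize` function below is a hypothetical simplification of the paper's rasterization network (a Gaussian alpha mask in place of learned stroke geometry); color is applied only at compositing time, and the same stroke sequence renders at any canvas size.

```python
import numpy as np

def rasterize(shape_params, size):
    """Stand-in for the rasterization network: shape parameters
    (here just a center and radius, a toy version of real stroke
    geometry) produce a soft alpha mask only -- no color."""
    cx, cy, r = shape_params
    ys, xs = np.mgrid[0:size, 0:size] / (size - 1)
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * r ** 2))

def paint(strokes, size):
    """Composite (shape, color) strokes sequentially onto a blank
    canvas, mimicking the stroke-by-stroke painting order. Since
    strokes are parameter vectors, any output resolution works."""
    canvas = np.zeros((size, size, 3))
    for shape_params, color in strokes:
        alpha = rasterize(shape_params, size)[..., None]
        canvas = alpha * np.array(color) + (1 - alpha) * canvas
    return canvas

# The same two strokes rendered at two resolutions.
strokes = [((0.7, 0.7, 0.1), (0.2, 0.4, 1.0)),    # blue stroke
           ((0.25, 0.25, 0.1), (1.0, 0.1, 0.1))]  # red stroke, painted last
small = paint(strokes, 64)
large = paint(strokes, 256)
```

Changing a stroke's color here touches only the compositing step, never the mask, which is the practical payoff of disentangling shape from color.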
The researchers compared Stylized Neural Painter with peer methods such as Learning to Paint and SPIRAL, both stroke-based image-to-painting translation methods that train reinforcement-learning agents. They found that Stylized Neural Painter generates more vivid results with a more apparent distinction between brush textures.