How to use ControlNet Inpaint: A Comparative Review of Three Preprocessors

In this article, we will discuss how to use ControlNet Inpaint, a new feature introduced in ControlNet 1.1. While inpainting is also available in img2img, ControlNet’s version often performs better, especially in cases where regular Inpaint fails. Additionally, ControlNet Inpaint works on the txt2img (t2i) screen, so there is no need to switch to the inpaint tab each time, which makes for a more user-friendly experience.

By reading this article, you will learn:

  • What ControlNet Inpaint can do
  • How to use ControlNet Inpaint effectively
  • The differences between the Inpaint preprocessors

Capabilities of ControlNet Inpaint

ControlNet Inpaint offers the following features:

  • Hairstyle transformation
  • Clothing transformation
  • Object removal/erasure

To gain a better understanding of these capabilities, let’s examine some results achieved using ControlNet Inpaint. Later on, we will provide a detailed explanation of how to utilize ControlNet Inpaint effectively.

Hairstyle Transformation

The images below demonstrate the application of a hair area mask with the prompt “short bob hair.” ControlNet Inpaint can be employed to transform hairstyles.

Clothing Transformation

The images below illustrate the application of a clothing area mask with the prompt “t-shirts.” ControlNet Inpaint can be utilized to transform clothing.

Object Removal/Erasure

ControlNet Inpaint can also be applied to mask and remove/erase unwanted objects.

How to Use ControlNet Inpaint

Installing ControlNet

To use ControlNet Inpaint, you first need to install ControlNet. ControlNet Inpaint is a feature of ControlNet, which is an extension of the Stable Diffusion Web UI. If ControlNet is not yet installed on your system, please follow the instructions in the following article to install it.

What is ControlNet? What Can It Do? A Comprehensive Guide to Installing ControlNet on Stable Diffusion Web UI

Downloading the ControlNet Inpaint Model

To use ControlNet Inpaint, you need the dedicated ControlNet Inpaint model. Download the following two files from the link below and place them in the stable-diffusion-webui/models/ControlNet directory:

  • control_v11p_sd15_inpaint.pth
  • control_v11p_sd15_inpaint.yaml

lllyasviel/ControlNet-v1-1 at main (huggingface.co)
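
If you prefer to fetch these files from a script instead of the browser, here is a minimal sketch using the huggingface_hub Python library. It assumes huggingface_hub is installed and that the Web UI lives at ~/stable-diffusion-webui; adjust the destination path to your own setup.

    from pathlib import Path
    from huggingface_hub import hf_hub_download

    # Destination folder inside the Web UI installation (adjust to your own path).
    dest = Path.home() / "stable-diffusion-webui" / "models" / "ControlNet"
    dest.mkdir(parents=True, exist_ok=True)

    # Fetch the weights (.pth) and config (.yaml) from lllyasviel/ControlNet-v1-1.
    for filename in ["control_v11p_sd15_inpaint.pth", "control_v11p_sd15_inpaint.yaml"]:
        hf_hub_download(
            repo_id="lllyasviel/ControlNet-v1-1",
            filename=filename,
            local_dir=dest,
        )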

Usage Steps in Stable Diffusion Web UI (txt2img)

Follow these steps to use ControlNet Inpaint in the Stable Diffusion Web UI:

Open the ControlNet menu. (Step 1/3)


Extract the features for inpainting using the following steps. (Step 2/3)

  • Set an image in the ControlNet menu and draw a mask on the areas you want to modify.
  • Check the Enable option.
  • Select “Inpaint” as the Control Type.
  • Click the feature extraction button “💥” to extract the features.

Finally, generate the image using the txt2img procedure. (Step 3/3)

  • Enter prompts and negative prompts.
  • Click the “Generate” button to generate the image.
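
The same workflow can also be driven from Python through the Web UI API (launch the UI with the --api flag). The sketch below follows the sd-webui-controlnet API layout, but the exact unit field names (“image”, “mask”) and the model string can differ between ControlNet versions, so treat them as assumptions and check the /docs page of your own instance. Here input.png is the reference image and mask.png is the mask you drew (white marks the area to repaint).

    import base64
    import requests

    def b64(path):
        # Read an image file and return it as a base64 string for the API.
        with open(path, "rb") as f:
            return base64.b64encode(f.read()).decode("utf-8")

    payload = {
        "prompt": "1girl, short bob hair",
        "negative_prompt": "lowres, bad anatomy",
        "steps": 20,
        "width": 512,
        "height": 768,
        "alwayson_scripts": {
            "controlnet": {
                "args": [
                    {
                        "image": b64("input.png"),            # reference image
                        "mask": b64("mask.png"),              # white = area to repaint
                        "module": "inpaint_only",             # preprocessor (Control Type: Inpaint)
                        "model": "control_v11p_sd15_inpaint", # adjust to the name shown in your UI
                        "enabled": True,
                        "weight": 1.0,
                    }
                ]
            }
        },
    }

    r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
    r.raise_for_status()
    with open("result.png", "wb") as f:
        f.write(base64.b64decode(r.json()["images"][0]))

If the call fails with an unknown-field error, compare the payload against the ControlNet section of your instance’s /docs page, since the accepted field names have changed across versions.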

Usage Steps in Stable Diffusion Web UI (img2img, i2i)

How to use ControlNet Inpaint from img2img is explained in the following article. In newer versions, using it from img2img is often more convenient than from txt2img.

Improved Convenience! Using ControlNet with img2img (i2i) and inpaint

Examples of Using ControlNet Inpaint

Changing Hairstyle

To add “bob short hair” to the image, follow these steps:

Prompt: 1girl, a 20-year-old pretty Japanese girl in a classroom, wearing a school uniform, standing in front of a blackboard with bob short hair

Draw a mask over the hair area so that it is used for feature extraction and inpainting.

The resulting images are displayed below. The left image is the reference image, and the right image is the inpainted image. It can be seen that the specified “bob short hair” in the prompt has been properly applied.

Changing Clothing

To change the clothing, draw a mask around the desired portion.

Prompt: 1girl, a 20-year-old pretty Japanese girl in a classroom, wearing a t-shirt, standing in front of a blackboard

Changing Clothing Color: Struggles with Color Replacement

To convert the clothing color to white, enter the following prompt:

Prompt: 1girl, a 20-year-old pretty Japanese girl in a classroom, wearing a white t-shirt, standing in front of a blackboard

Although the settings remain the same, it doesn’t work well due to the influence of the blackboard in the background.

Attempts to adjust the prompt and switch the preprocessor did not yield satisfactory results. Normal inpainting also produced similar outcomes.

From these results, it is clear that inpainting is not suitable for color replacement. If you want to change the color of the clothing, consider using ControlNet Tile or Reference Only.

Object Removal

Let’s attempt to remove the objects displayed on the blackboard and create a clean background. Change the preprocessor to “inpaint_only+lama” as it is known to have better removal performance. Further explanation will be provided later.

The result is shown below. It can be observed that the objects displayed on the blackboard have been successfully removed.
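
For reference, reproducing this removal through the API sketch above only requires changing the ControlNet unit’s preprocessor, mask, and prompt. The snippet below continues that sketch (reusing its payload and b64 helper); the file names and field names are again assumptions.

    # Reuse the payload and b64 helper from the txt2img API sketch above.
    unit = payload["alwayson_scripts"]["controlnet"]["args"][0]
    unit["module"] = "inpaint_only+lama"         # LaMa-assisted preprocessor, better at removal
    unit["image"] = b64("classroom.png")         # image containing the objects to erase
    unit["mask"] = b64("blackboard_mask.png")    # white over the objects to erase

    # Describe the clean background you want, not the object you are removing.
    payload["prompt"] = "a clean empty blackboard in a classroom"

    r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
    r.raise_for_status()
    with open("removal_result.png", "wb") as f:
        f.write(base64.b64decode(r.json()["images"][0]))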

About the Inpaint Preprocessor

ControlNet Inpaint offers three main preprocessors: inpaint_only, inpaint_only+lama, and inpaint_global_harmonious.

Now, let’s compare the differences between these preprocessors using images. We’ll switch between preprocessors and observe the results on the earlier bob hair example (a scripted version of this comparison is sketched after the list below).

  • Top left: Reference image
  • Top right: inpaint_only
  • Bottom left: inpaint_only+lama (which shows smoother blending compared to inpaint_only)
  • Bottom right: inpaint_global_harmonious (which makes changes not just within the masked area but also in the surrounding area)
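
To reproduce this comparison yourself, the only thing that needs to change between runs is the module field of the ControlNet unit. The loop below continues the earlier API sketch (same payload and helper); fixing the seed keeps everything else constant so that only the preprocessor differs.

    payload["seed"] = 12345  # fix the seed so that only the preprocessor differs between runs

    for module in ["inpaint_only", "inpaint_only+lama", "inpaint_global_harmonious"]:
        payload["alwayson_scripts"]["controlnet"]["args"][0]["module"] = module
        r = requests.post("http://127.0.0.1:7860/sdapi/v1/txt2img", json=payload)
        r.raise_for_status()
        with open("compare_" + module.replace("+", "_") + ".png", "wb") as f:
            f.write(base64.b64decode(r.json()["images"][0]))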

About the inpaint_only preprocessor

The inpaint_only preprocessor regenerates only the masked area, leaving the rest of the image untouched.

About the inpaint_global_harmonious preprocessor

The inpaint_global_harmonious preprocessor also modifies the region surrounding the mask, which helps the inpainted area blend seamlessly with the rest of the image.

About the inpaint_only+lama preprocessor

The inpaint_only+lama preprocessor combines inpaint_only with LaMa, a highly stable inpainting method. The name LaMa stands for Large Mask inpainting and comes from the paper “Resolution-robust Large Mask Inpainting with Fourier Convolutions.”

For more information on inpaint_only+lama, you can refer to the Preprocessor: inpaint_only+lama page on the ControlNet GitHub repository.

The results from inpaint_only+lama usually look similar to inpaint_only but a bit “cleaner”: less complicated, more consistent, and with fewer random objects. This makes inpaint_only+lama suitable for image outpainting or object removal.

https://github.com/Mikubill/sd-webui-controlnet/discussions/1597

Frequently Asked Questions (FAQ)

How can I enlarge the Inpaint screen?

To enlarge the Inpaint screen, you can use ControlNet Inpaint from the img2img (inpaint) screen instead of txt2img. This allows you to edit on a larger canvas.

How can I enlarge the brush in the Inpaint screen?

To enlarge the brush in the Inpaint screen, follow these steps:

  1. Click on the brush icon located in the top right corner of the Inpaint screen.
  2. Adjust the brush size using the slider that appears.
