
A few lines of code stabilize UNet! Sun Yat-sen University and others propose the ScaleLong diffusion model: from questioning Scaling to becoming Scaling


In the standard UNet architecture, the scaling coefficient κ_i on each long skip connection is simply set to 1.

However, several well-known diffusion model works, such as Imagen, the score-based generative models, and SR3, instead set the coefficient to κ_i = 1/√2, and found that this setting effectively accelerates the training of the diffusion model.
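As a minimal sketch of what such a scaled long skip connection looks like (an illustration only, not the original Imagen/SR3 code; the module name and shapes are assumptions), in PyTorch:

```python
import torch
import torch.nn as nn

class ScaledSkipBlock(nn.Module):
    """Toy UNet decoder step that merges a long-skip feature scaled by kappa."""
    def __init__(self, channels: int, kappa: float = 2 ** -0.5):
        super().__init__()
        self.kappa = kappa  # 1.0 = standard UNet; 1/sqrt(2) as in Imagen/SR3
        self.conv = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
        # Scale the encoder feature arriving over the long skip connection,
        # then merge it with the decoder feature as usual.
        return self.conv(torch.cat([x, self.kappa * skip], dim=1))

block = ScaledSkipBlock(channels=64)
x = torch.randn(1, 64, 32, 32)     # decoder feature
skip = torch.randn(1, 64, 32, 32)  # encoder feature via the long skip
out = block(x, skip)               # shape (1, 64, 32, 32)
```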


Questioning Scaling

However, Imagen and the other works discuss this scaling of skip connections only briefly: the original papers give no specific analysis, merely noting that the setting helps speed up training of the diffusion model.

First of all, this purely empirical evidence leaves it unclear what role the setting actually plays.

In addition, we do not know whether κ = 1/√2 is the only possible choice, or whether other constants would work as well.

Are the "status" of skip connections at different locations the same? Why use the same constant?

The author has a lot of questions about this...


Understanding Scaling

Generally speaking, compared with ResNet and Transformer architectures, the UNets used in practice are not particularly "deep", and are therefore less prone to optimization problems, such as vanishing gradients, that are common in deeper networks.

In addition, thanks to the particular structure of UNet, shallow features are fed directly to deep layers through long skip connections, which further avoids problems such as vanishing gradients.

Thinking about it the other way around: if this structure is left unchecked, could it instead cause problems such as excessively large gradients and oscillation of parameters (and features) during updates?


By visualizing the features and parameters of the diffusion model during training, one can indeed observe this kind of instability.

Unstable parameters (features) affect the gradients, which in turn affect the parameter updates, and ultimately this process runs a real risk of harming performance. We therefore need to find a way to control this instability.

Moreover, for a diffusion model, the input to the UNet is a noisy image; if the model is required to accurately predict the added noise, it needs to be strongly robust to additional disturbances in its input.
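To make this robustness requirement concrete, here is a small probe (purely illustrative; `model`, the noise scale `eps`, and the trial count are placeholders) that estimates how strongly a network's output reacts to small input perturbations:

```python
import torch

@torch.no_grad()
def perturbation_sensitivity(model, x, eps=1e-3, n_trials=8):
    """Average output change per unit of input change: a rough robustness probe."""
    y = model(x)
    ratios = []
    for _ in range(n_trials):
        delta = eps * torch.randn_like(x)   # small random input perturbation
        ratios.append((model(x + delta) - y).norm() / delta.norm())
    return torch.stack(ratios).mean()

# e.g. with a UNet noise predictor: lower values mean the prediction
# is more stable under extra disturbances of the noisy input.
# sens = perturbation_sensitivity(unet, noisy_image)
```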


Paper: https://arxiv.org/abs/2310.13545

Code: https://github.com/sail-sg/ScaleLong

The researchers found that all of the problems above can be alleviated in a unified way by scaling the long skip connections.


Theorem 3.1 shows that the oscillation range of intermediate-layer features (the width between their upper and lower bounds) is directly tied to the sum of the squares of the scaling coefficients: appropriately chosen scaling coefficients help alleviate feature instability.
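Schematically, the statement has the following shape (only the structure is shown here; the precise constants and assumptions are in Theorem 3.1 of the paper), where h_m denotes an intermediate-layer feature and κ_i the scaling coefficients:

```latex
% Schematic form only; exact constants and assumptions are in Theorem 3.1.
\|h_m\|_2 \in \big[\,B_{\mathrm{low}},\; B_{\mathrm{up}}\,\big],
\qquad
B_{\mathrm{up}} - B_{\mathrm{low}} \;=\; \mathcal{O}\Big(\sum_i \kappa_i^2\Big)
```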

That said, if the scaling coefficients are set straight to 0, the oscillation is indeed suppressed optimally. (Tongue in cheek.)

But then UNet degenerates into a network without skip connections, and while the instability problem is solved, its representational power is largely lost as well. This is a trade-off between model stability and representational capability.


Similarly, from the perspective of parameter gradients, Theorem 3.3 reveals that the scaling coefficients control the magnitude of the gradients.
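Again only schematically (the exact statement is Theorem 3.3 in the paper), the gradient magnitude is governed by the same quantity:

```latex
% Schematic form only; exact statement in Theorem 3.3.
\big\|\nabla_{W}\,\mathcal{L}\big\| \;=\; \mathcal{O}\Big(\sum_i \kappa_i^2\Big)
```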


Going further, Theorem 3.4 reveals that scaling the long skip connections also bounds the effect of input perturbations, thereby improving the stability of the diffusion model against disturbances in its input.

Becoming Scaling

Through the above analysis, we now understand how important scaling the long skip connections is for stable model training; the κ = 1/√2 setting is likewise covered by this analysis.

Next, we analyze what kind of scaling gives better performance. After all, the analysis above only shows that scaling helps; it does not determine which scaling is optimal, or at least better.

A simple approach is to attach a learnable module to each long skip connection so that the scaling is adjusted adaptively. We call this the Learnable Scaling (LS) method, and use a SENet-like structure for it, as sketched below. (The U-ViT architecture considered here is very cleanly organized; highly recommended!)
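A minimal sketch of such a SENet-style calibration module (an illustration of the idea rather than the authors' exact implementation; the class name and reduction factor are assumptions):

```python
import torch
import torch.nn as nn

class LearnableScaling(nn.Module):
    """SENet-style gate that predicts a per-channel scaling for a skip feature."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                        # squeeze: global context
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                   # coefficients in (0, 1)
        )

    def forward(self, skip: torch.Tensor) -> torch.Tensor:
        # Adaptively rescale the long-skip feature before it is merged.
        return skip * self.net(skip)

# usage: one module per long skip connection, e.g.
#   skip = ls_modules[i](skip)
# before the skip feature enters the decoder.
```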

[Figure: the SENet-like Learnable Scaling (LS) module attached to the long skip connections]

Judging from the results in the paper, LS does indeed stabilize diffusion model training effectively! We then tried visualizing the coefficients that LS learns.

As shown in the figure below, the learned coefficients exhibit an exponentially decaying trend (note that the first long skip connection here means the one connecting the outermost ends of the UNet), and the first coefficient is almost exactly 1. A remarkable phenomenon!

[Figure: LS coefficients at different long-skip positions, decaying roughly exponentially from about 1]

Based on this series of observations (please refer to the paper for more details), we further propose the Constant Scaling (CS) method, which requires no learnable parameters at all:

κ_i = κ^(i−1), i = 1, 2, …, with a constant κ < 1

The CS strategy, like the original scaling with κ = 1/√2, requires no extra parameters, and thus incurs almost no additional computational cost.
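A minimal sketch of CS (illustrative only; the value of κ is a placeholder hyperparameter, and the paper discusses how to choose it):

```python
kappa = 0.7  # placeholder constant; see the paper for how to choose it

def scale_skips(skips):
    """Constant Scaling: multiply the i-th long skip connection (1-indexed,
    outermost first) by kappa ** (i - 1). No learnable parameters."""
    # enumerate is 0-indexed, so idx = i - 1 and the factor is kappa ** idx
    return [s * kappa ** idx for idx, s in enumerate(skips)]
```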

Although CS usually does not stabilize training quite as well as LS does, it is still well worth trying for anyone already using the κ = 1/√2 strategy.

Both CS and LS are very simple to implement, requiring only a few lines of code; for the various fancy UNet variants out there, one may just need to align the feature dimensions. (Tongue in cheek, again.)


Recently, follow-up works such as FreeU and SCEdit have also revealed the importance of scaling skip connections. Everyone is welcome to try these methods out and spread the word.

