Let's begin with the ideal case: tensors perpendicular to each other.
## Standard
`c` - conditioning (positive tensor)
`u` - unconditioning (negative tensor)
light blue - resulting noise prediction

## Perp-neg
`p` - positive tensor
`n` - negative tensor
`u` - empty tensor
`w_n` - weight of negative tensor
`p' = p - u`
`n' = n - u`
light blue - resulting noise prediction with negative tensor weight = 1
blue - resulting noise prediction with negative tensor weight = 0.5
*// After the result is calculated it is added to the `u` (empty) tensor, but that is a bit irrelevant here*
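Putting the definitions above together, here is a minimal sketch of the perp-neg prediction in numpy. The function and argument names are my own; this assumes the usual perp-neg formulation where only the component of `n'` perpendicular to `p'` is subtracted:

```python
import numpy as np

def perp_neg(p, n, u, cfg, w_n):
    # Shift both prompt tensors relative to the empty (unconditional) tensor
    p_d = p - u  # p' = p - u
    n_d = n - u  # n' = n - u
    # Component of n' perpendicular to p' (the shared part is projected out)
    n_perp = n_d - (np.dot(n_d, p_d) / np.dot(p_d, p_d)) * p_d
    # Guided prediction: positive direction minus the weighted perpendicular negative
    return u + cfg * (p_d - w_n * n_perp)
```

With perpendicular tensors (the ideal case above) the projection is zero, so the negative tensor is applied at full strength `w_n` regardless of cfg.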

So we can see that in standard conditioning we get different strengths for the positive increase and the negative decrease:
| cfg | positive | negative | ratio (neg/pos) |
| - | - | - | - |
| 1 | 1 | 0 | 0 |
| 2 | 2 | 1 | 0.5 |
| 3 | 3 | 2 | 0.67 |
| 4 | 4 | 3 | 0.75 |
But in perp-neg this ratio is the same at any cfg; you can only manage it through the negative prompt/tensor weight.
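The strengths in the table follow from the standard CFG formula `result = u + cfg * (c - u) = cfg * c - (cfg - 1) * u`: the positive tensor is scaled by `cfg` and the negative by `cfg - 1`. A small sketch (the function name is my own) reproduces the columns:

```python
def standard_cfg_strengths(cfg):
    # Standard CFG: result = u + cfg * (c - u) = cfg * c - (cfg - 1) * u,
    # so the positive coefficient is cfg and the negative is (cfg - 1).
    pos = cfg
    neg = cfg - 1
    return pos, neg, neg / pos

for cfg in (1, 2, 3, 4):
    print(cfg, standard_cfg_strengths(cfg))
```

As cfg grows, the ratio `(cfg - 1) / cfg` creeps toward 1, which is why the negative prompt's relative influence depends on cfg in the standard scheme.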
Let's look at a less ideal scenario - non-perpendicular tensors:
## Standard

## Perp-neg
`n'_p` - the component of the negative tensor that is perpendicular to the positive tensor

So, as we can see, the difference is that perp-neg eliminates the shared component between the positive and negative prompts, so they no longer affect each other.
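Computing `n'_p` is a standard vector rejection; a sketch (function name is my own):

```python
import numpy as np

def perp_component(n_d, p_d):
    # The projection of n' onto p' is the "shared" part of the two prompts;
    # subtracting it leaves only the perpendicular part n'_p.
    parallel = (np.dot(n_d, p_d) / np.dot(p_d, p_d)) * p_d
    return n_d - parallel
```

For example, for non-perpendicular tensors `n' = (1, 1)` and `p' = (2, 0)`, the shared part `(1, 0)` is removed and only `(0, 1)` remains to push against the result.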
Table of tensors from the example graphics:
Standard:
| cfg | dissimilar | perpendicular | similar |
| - | - | - | - |
| 1 | 3, 0 | 3, 0 | 3, 0 |
| 2 | 7, 4 | 6, 4 | 5, 4 |
| 3 | 11, 8 | 9, 8 | 7, 8 |
| 4 | 15, 12 | 12, 12 | 9, 12 |
Perp-neg:
| cfg | neg_weight=0.5 | neg_weight=1 |
| - | - | - |
| 1 | 3, 2 | 3, 4 |
| 2 | 6, 4 | 6, 8 |
| 3 | 9, 6 | 9, 12 |
| 4 | 12, 8 | 12, 16 |