# Is !(~A && ~B) better than (A||B) in programming?

I am developing in `Java` and I am using `IntelliJ` as my IDE. I wrote an `if` statement as follows.

```java
if (list1.size() >= 1 || list2.contains(itemX)) {
    // do something
}
```

`IntelliJ` suggested a transformation (De Morgan's law) and transformed it to:

```java
if (!(list1.size() < 1 && !list2.contains(itemX))) {
    // do something
}
```

So it applied a very common identity from discrete mathematics for rewriting boolean expressions. What I am wondering is: how does this optimize anything?

The `||` operator already short-circuits: it does not evaluate the right-hand side at all if the left-hand side is true, and only evaluates the RHS when the left-hand side is false.

Is the transformed condition more efficient? If so, how?
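To see that the two conditions really are interchangeable, here is a minimal standalone check (the class and method names are my own, not from the original post) that compares `a || b` against its De Morgan form `!(!a && !b)` for every combination of inputs:

```java
// Exhaustive check that (a || b) and !(!a && !b) agree on all inputs.
public class DeMorganCheck {
    static boolean original(boolean a, boolean b) {
        return a || b;
    }

    static boolean transformed(boolean a, boolean b) {
        return !(!a && !b);
    }

    public static void main(String[] args) {
        for (boolean a : new boolean[]{false, true}) {
            for (boolean b : new boolean[]{false, true}) {
                System.out.printf("a=%b b=%b -> %b / %b%n",
                        a, b, original(a, b), transformed(a, b));
            }
        }
    }
}
```

Since the two methods agree on all four input combinations, the IDE's transformation cannot change which branch the `if` takes.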

Guest

Both statements are exactly equivalent.

I agree that the OR operator does not evaluate the second operand if the first is true; however, it is also true that the AND operator does not evaluate the second operand if the first is false. So both forms short-circuit in exactly the same cases.

If anything, evaluating `!(!A && !B)` involves slightly more work than `A || B`, because of the extra negations.

Hope this helps 🙂

Guest

This is somewhat subjective, but a good rule of thumb is to remove as much complexity as possible, where by complexity I mean the number of operations you need to perform to obtain the desired result. In that sense, `!a && !b` is worse than `!(a || b)`: in the first case you negate `a`, negate `b`, and then perform the AND, three operations, whereas in the second you perform only two. Of course this hardly matters with two conditions, but when you are dealing with many, it can make a noticeable difference.
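The counting argument above scales with the number of conditions. A small sketch with three inputs (names are illustrative; the assumption here is that fewer written negations reads more simply, not that it necessarily runs faster):

```java
// With three conditions, the grouped form writes one negation while the
// expanded De Morgan form writes three; both are logically equivalent.
public class ManyConditions {
    static boolean grouped(boolean a, boolean b, boolean c) {
        return !(a || b || c);   // 1 negation, 2 ORs
    }

    static boolean expanded(boolean a, boolean b, boolean c) {
        return !a && !b && !c;   // 3 negations, 2 ANDs
    }
}
```

With *n* conditions the expanded form carries *n* negations against the grouped form's single one, which is where the readability difference shows up.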