Compiler warnings are good, actually

There is a trend among new programming languages to not produce compiler warnings at all, favoring errors instead. There are a few problems with this.

Wasted time

A common thing for compilers to complain about is unused variables. While unused variables aren't something you want in a finished product, they often appear when you try to test your partially written code.

If it's just a warning, you can simply ignore it, as you are going to use that variable later. However, if it's an error, you have to go into your code, comment out the variable declaration, run the code, and then uncomment it again, just to satisfy the compiler.
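To make this concrete, here is a minimal sketch in Go, one language that treats unused local variables as hard errors. The parse stub and the variable names are invented for the example:

    package main

    import "fmt"

    func main() {
        rawInput := "42,17,99"

        // parsed := parse(rawInput)
        // Commented out for now: with that line active and parsed not yet
        // used anywhere, Go refuses to compile, reporting that parsed is
        // declared and not used.

        // For the moment we only want to confirm the input gets this far.
        fmt.Println("got input:", rawInput)
    }

    // Stub for the half-written function (invented for this sketch).
    func parse(s string) []string { return []string{s} }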

This isn't just a one-time problem, either. While debugging, you might comment out a line of code that uses a variable, only to have to do the same to the variable's declaration.

Combativeness

Unlike warnings, which are a mild annoyance at most, errors can feel like you are fighting the compiler. This may tempt you to try to defeat it.

Using the example of unused variables again, say a programmer got tired of commenting and uncommenting a line of code. Most programming languages have a way of nominally "using" a value, largely to get around errors like this. In many of them, this means assigning the value to an underscore, as in _ = value.
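In Go, for instance, the blank identifier does exactly this. A minimal sketch of the workaround, using the same invented names as before:

    package main

    import "fmt"

    func main() {
        rawInput := "42,17,99"
        parsed := parse(rawInput)

        // The blank identifier counts as a use, so this one line silences
        // the unused-variable error without touching the declaration or
        // the rest of the code.
        _ = parsed

        fmt.Println("got input:", rawInput)
    }

    // Stub standing in for code that is still being written.
    func parse(s string) []string { return []string{s} }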

Now, say the programmer does this, and then genuinely forgets to use the variable later on. A warning would have reminded them (or at least informed their successor), but the error has encouraged them to silence it, and in doing so it has undermined its own usefulness.
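A sketch of how that plays out, continuing the same invented example: the suppression line is still sitting there, the "finished" code accidentally prints the raw string instead of the parsed values, and the compiler has nothing to say about it.

    package main

    import (
        "fmt"
        "strings"
    )

    func main() {
        rawInput := "42,17,99"
        parsed := parse(rawInput)
        _ = parsed // left over from earlier, and now it hides the problem

        // Bug: the raw string is printed instead of the parsed values,
        // but the compiler stays silent because the blank assignment
        // above already counts as a use of parsed.
        fmt.Println("values:", rawInput)
    }

    // The parser, finally written (still invented for the example).
    func parse(s string) []string { return strings.Split(s, ",") }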