Yes, he said that they are close to L5. Here are two relevant quotes:
"I’m extremely confident that level 5 or essentially complete autonomy will happen and I think will happen very quickly,” Musk said in remarks made via a video message at the opening of Shanghai’s annual World Artificial Intelligence Conference (WAIC).
“I remain confident that we will have the basic functionality for level 5 autonomy complete this year.”
"
Tesla 'very close' to level 5 autonomous driving technology, Musk says
The reason I think it is a bad thing is that it implies your self-driving is not very good. After all, the basic idea of self-driving is that the car can drive without any human input. Thus, every disengagement potentially marks a moment when your car failed at being self-driving. You obviously want your self-driving car to need human input as infrequently as possible.
To give you a point of comparison, Waymo and Cruise have a disengagement rate of about 1 per 10,000 miles. So 1 per a few hundred miles would be much less reliable self-driving.
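To make that comparison concrete, here is a quick sketch. The 1-per-10,000-miles figure is from the text; treating "a few hundred miles" as roughly 300 miles is my assumption:

```python
# Rough comparison of disengagement rates (per-mile frequency).
waymo_rate = 1 / 10_000  # Waymo/Cruise: ~1 disengagement per 10,000 miles
other_rate = 1 / 300     # assumption: "a few hundred miles" ~= 300 miles

# How many times more often the less reliable system disengages.
ratio = other_rate / waymo_rate
print(f"~{ratio:.0f}x more disengagements per mile")  # ~33x
```

So even on a generous reading of "a few hundred miles," that is roughly 30x more frequent disengagements than Waymo/Cruise.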
Now it is worth noting that not all disengagements are caused by the same thing. Some disengagements are not safety-related: for example, the safety driver did not like what the autonomous car was doing, but there was no safety issue. Others are safety-related: if the safety driver had not intervened, there was a high probability of an accident. The Waymo/Cruise rate includes all disengagements, both safety-related and not.
It is also worth comparing the safety disengagement rate to the rate of accidents by human drivers. On average, human drivers in the US have a car accident roughly once every 533,666 miles. So if we expect our self-driving car to be as safe as a human driver, we would expect it to have a safety disengagement only about once every 533,666 miles as well.
Again, that tells us that if an autonomous car is having a safety-related disengagement every few hundred miles, then it is well over a thousand times less safe than a human driver. Therefore, it is a very bad disengagement rate.
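The back-of-envelope arithmetic looks like this. The 533,666-mile human accident figure is from the text; again, treating "every few hundred miles" as roughly 300 miles is my assumption:

```python
# Back-of-envelope: safety disengagements vs. human accident rate.
human_accident_miles = 533_666    # miles per accident, average US driver
safety_disengage_miles = 300      # assumption: "every few hundred miles"

# How many times more frequent the car's safety failures are.
ratio = human_accident_miles / safety_disengage_miles
print(f"~{ratio:.0f}x less safe than an average human driver")  # ~1779x
```

If "a few hundred" means 200 miles instead, the ratio climbs to roughly 2,700x; either way it is orders of magnitude worse than a human driver.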
Lastly, SAE Level 3 means that the car is fully self-driving in some conditions but must notify the driver in advance when the driver needs to take over. L3 therefore assumes that disengagements are predictable, since the car has to know ahead of time that they will be needed; it cannot have surprise disengagements where the safety driver has to jump in at the last second to intervene. So if your autonomous car is having surprise disengagements every few hundred miles, it is not Level 3.