I think Apple is completely right about this. They currently have no way to override certain encryption and locking features they encourage their customers to use for privacy. Modern iPhones locked behind a passcode are essentially useless to the government in investigations (though the particular phone in the first major dispute was reportedly unlocked in the end by a third party). This became a national issue in 2015/2016, when the FBI sought a court order requiring Apple to provide access to phones it has no existing tools to unlock without the user. The government's argument was that, in the wake of the San Bernardino terror attack, its request for access to the phone recovered at the scene was no different from other data requests that tech companies are legally required to comply with as US businesses.
To this day, Apple hosts this response letter to the orders:
Apple has pointed out that the moment it is required to build tools that break the privacy of its devices, its users around the world and in the USA will be under threat: activists, journalists, and ordinary citizens could end up targets of such tools under government direction.
And they're right. They should not be forced to end passcode privacy for devices in the hands of hundreds of millions of people.
Attorney General Barr is playing dirty now (as Republicans do), claiming Apple isn't helping investigators fight terrorism, but this is a lie. Apple has provided the data it holds, as the law requires; it simply will not create a tool that unlocks the iPhone.
Tough on crime shouldn't mean fuck everyone's privacy too.