Games are very rarely made on reference monitors. Likely some high-end AAA games are, but we'd probably be talking about games like The Last of Us Part 2, and certainly not games that end up on the Switch. Outside of some weird specifics (like how DOS VGA games were displayed through a 6-bit-per-channel palette despite colours being stored as 8-bit values in memory), colour isn't really something developers expect to be 100% accurate, because, aside from anything else, it just isn't feasible to assume everyone can view your game on an identical display.
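For anyone curious what that 6-bit/8-bit mismatch actually means, here's a rough sketch of the arithmetic (hypothetical helper names, not from any real codebase):

```python
def to_vga_dac(value_8bit: int) -> int:
    """Quantise an 8-bit channel value (0-255) down to the 6-bit
    range (0-63) that the VGA DAC actually accepts."""
    return value_8bit >> 2

def from_vga_dac(value_6bit: int) -> int:
    """Expand a 6-bit DAC value back to 8 bits. Two common conventions:
    a plain shift (v << 2, tops out at 252) or full-range scaling
    (round(v * 255 / 63), tops out at 255). Emulators and capture tools
    disagree on which to use, which is part of why "accurate" DOS
    colours are fuzzy in the first place."""
    return round(value_6bit * 255 / 63)

# Example: white stored as (255, 255, 255) in an 8-bit image becomes
# (63, 63, 63) in the DAC, and comes back as 255 with scaling but
# only 252 with a plain shift.
white_dac = tuple(to_vga_dac(c) for c in (255, 255, 255))
print(white_dac, tuple(from_vga_dac(c) for c in white_dac))
```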
It's nice that you'd want to go to the effort of configuring that, but your average developer isn't really going to be all that fussed about the Switch's colour profile.
You don't need a reference monitor. Most PC monitors either have an sRGB mode or are natively sRGB.
Most televisions these days have BT.709 color (same gamut as sRGB) if you switch them to a more accurate preset like "cinema" mode, rather than "vivid."
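If anyone wants to sanity-check the "same gamut" claim, here's a quick sketch using the published chromaticity values (the numbers are from the sRGB and BT.709 specs; the variable names are just mine). The two standards share identical primaries and the D65 white point; they differ mainly in transfer function, which affects tone, not gamut:

```python
# sRGB and BT.709 primaries as CIE 1931 xy coordinates.
SRGB_PRIMARIES = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}

BT709_PRIMARIES = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}

# Same triangle on the chromaticity diagram, i.e. the same gamut.
assert SRGB_PRIMARIES == BT709_PRIMARIES
```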
Of course a reference display is best, but so long as you aren't working on a super-inaccurate wide-gamut display or mode, it should mostly translate to other screens.
But there are many games which do look awful in their default state.
I have a strong suspicion that many Gen 7 games were designed on wide-gamut displays, since it was common back then for televisions to lack a good color-accurate preset, or even a color management system capable of accurate color.
And I don't see the need to police what mode people use.
Inform people that standard mode is the most accurate and displays color as it's meant to look, but if people want to use vivid mode, let them.
However, I will push back against people trying to say that vivid mode is best because it's "what Nintendo intended" or anything like that.