I honestly did not understand what he meant by that.
When you do graphics programming, you use a specific API to talk to your GPU. To make games interoperable across GPU manufacturers, the manufacturers support a common, agreed-upon set of standard features, called a "core profile." To be, for example, Vulkan or DirectX 12 compliant, a GPU must adhere to a set of core standards covering which features the card supports.
Beyond this core profile are what are known as extensions: additional features specific to a given card's design. Not only are there vendor-specific extensions, ones that only exist on Nvidia or AMD hardware, there are even architecture-specific extensions, features that only exist on certain ranges of cards from those GPU makers. For example, many Nvidia architectures support DirectX 12, but only Turing cards can access the RTX extensions that handle raytracing in hardware.
What they said in their reply is that the demo was built to ignore everything except the base standard: no Nvidia- or AMD-specific extensions. That gives a better picture of what the tech will look like across all hardware, instead of targeting specific extensions that (largely) get ignored in larger applications anyway. By using those specific extensions they could push things even harder, and those extensions are inevitably exposed to the programmer, but the demo was built to explicitly ignore them.
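The core-vs-extensions split can be sketched like this. This is illustrative Python, not a real graphics API: the core feature names and the `query_driver_extensions` helper are hypothetical stand-ins for what a driver would actually report at runtime (in Vulkan you'd call `vkEnumerateDeviceExtensionProperties`), though `VK_NV_ray_tracing` and `VK_NV_mesh_shader` are real Vulkan extension name strings.

```python
# Hypothetical baseline every conformant GPU must support (the "core profile").
CORE_FEATURES = {"compute_shaders", "tessellation", "indirect_draw"}

def query_driver_extensions(vendor):
    """Stand-in for a runtime driver query (e.g. Vulkan's
    vkEnumerateDeviceExtensionProperties). Returns the vendor-specific
    extension strings this hypothetical driver advertises."""
    return {
        "nvidia": {"VK_NV_ray_tracing", "VK_NV_mesh_shader"},
        "amd": {"VK_AMD_shader_info"},
    }.get(vendor, set())

def select_features(vendor, core_only=True):
    """Pick the feature set to enable for this run.

    A demo built the way described above passes core_only=True and never
    even looks at what the driver advertises beyond the core profile.
    """
    features = set(CORE_FEATURES)
    if not core_only:
        features |= query_driver_extensions(vendor)
    return features

# Core-only build: identical feature set on every conformant card.
print(select_features("nvidia", core_only=True))
# Extension-aware build: picks up RTX extras on Nvidia hardware only.
print(select_features("nvidia", core_only=False))
```

The point of the pattern: with `core_only=True` the output is the same set regardless of vendor, which is exactly why a core-only demo is a fairer preview of how the tech behaves across all hardware.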