I wanted to know this because older graphics hardware that only supported DirectX 11 now supports DirectX 12 too. This was not the case between DirectX 10 and DirectX 11 (i.e. DirectX 10 hardware could not support DirectX 11 games).
What is the difference between DirectX 11 and 12? How is it possible that older hardware can now use DirectX 12?
As a gamer it doesn’t mean too much. The most obvious difference is that DirectX 12 requires Windows 10, while DirectX 11 requires Windows 7 or later. DirectX 12 also requires a video card driver that supports it, which in practice means a relatively recent AMD, NVIDIA, or Intel video card with updated drivers.
In terms of its effect on games, DirectX 12 doesn’t really change what can be displayed; it just allows for more efficient rendering. Its main improvement is that it lets more than one CPU core submit commands to the graphics card at the same time. With DirectX 11 and earlier, games were effectively limited to submitting commands to the video card from only one core of a multicore CPU at a time.
However, the advantages of DirectX 12 aren’t easy for developers to exploit in practice. At this point, I don’t expect that many games will be able to make effective use of it. For the most part, only AAA games have both the resources and the need to usefully exploit DirectX 12.
Since DirectX 12 doesn’t really add new rendering functionality — it just changes how games access the video card — it’s possible to support it on older hardware simply by updating the drivers.
(To be a bit more technical, Direct3D 12 requires that the driver be updated to use WDDM 2.0 and that the hardware support at least feature level 11_0. The newer feature levels, 12_0 and 12_1, mostly affect how games can access graphics resources. The limited additional hardware requirements meant that some older “DirectX 11” hardware was even able to support the newer 12_0 level.)
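If you want to probe this programmatically, `D3D12CreateDevice` accepts a null output pointer, in which case it only reports whether the default adapter and its driver support the requested minimum feature level, without creating a device. A minimal sketch (Windows-only; needs the Windows SDK and linking against d3d12.lib, so it won’t build elsewhere):

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Passing nullptr for the device output asks D3D12CreateDevice to
    // report support without creating anything. Feature level 11_0 is
    // the minimum that Direct3D 12 requires of the hardware.
    HRESULT hr = D3D12CreateDevice(nullptr,                 // default adapter
                                   D3D_FEATURE_LEVEL_11_0,  // minimum level
                                   __uuidof(ID3D12Device),
                                   nullptr);                // probe only
    std::printf(SUCCEEDED(hr) ? "DirectX 12 supported\n"
                              : "DirectX 12 not supported\n");
    return 0;
}
```

You could repeat the call with `D3D_FEATURE_LEVEL_12_0` or `D3D_FEATURE_LEVEL_12_1` to see which of the newer levels, if any, the card reaches.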