QUOTE: "Intel is in development of drivers to support 4k/2k displays." (as of 2014-04-06)
You answered your own question.
Drivers are almost never done in 2 or 3 months. You can be sure that when the drivers are released, Intel will be shouting it from the mountaintops. And even once they say the drivers are out, expect lots and lots of bugs.
In the meantime, why would you spend thousands of dollars on a 4K display just to run things at about 3 FPS? A 4K frame is massive compared to 1080p (four times the pixels), so if you get 20 FPS at 1920x1080, at 4K you might get 3-5 FPS. Intel graphics are really weak. They get better with each CPU refresh, but they are still a long way behind what AMD and Nvidia can do today.
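To put a number on "massive", here is a quick back-of-envelope sketch, assuming the common 3840x2160 UHD resolution that monitors marketed as "4K" use:

```python
# Pixel-count comparison: "4K" UHD (3840x2160) vs 1080p (1920x1080).
uhd = 3840 * 2160  # 8,294,400 pixels per frame
fhd = 1920 * 1080  # 2,073,600 pixels per frame

# The GPU has to push four times the pixels every frame.
print(uhd / fhd)   # -> 4.0

# Rough frame-rate expectation if performance scaled linearly with pixels:
fps_1080p = 20
print(fps_1080p / (uhd / fhd))  # -> 5.0 FPS at 4K
```

Real scaling is rarely perfectly linear (memory bandwidth and fill rate hit different limits), which is why the actual result can land even lower than the naive 5 FPS estimate.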
To use 4K effectively, you are going to need a seriously heavy-duty computer with 2-3 high-end Nvidia video cards, set up SLI across them, and cross your fingers that everything works together, because there are no real standards yet. While some manufacturers are working together to make things interoperate, some monitors only work if two DisplayPort (DP) channels carry the whole image (half on each DP connection), other monitors accept frankenstein HDMI connections, and some actually have new 4K chips in them that only work over DP 1.2. Right now you would probably spend a good $1500 or so for a decent 4K panel, and then another $1500 to $3000 on video cards trying to get decent frame rates on that panel. And even after you spend all that money, you don't really know whether the panel uses the same standards the video cards do.
In a year or three, 4K displays should all follow a single standard, and all video card manufacturers should support it too. But that is NOT the case today.