What is being done in VR displays now is not "vision correction", it's "distortion correction." Because the optics in consumer HMDs are so awful, the software pre-distorts the image to compensate: the rendered frame is warped with the inverse of the lens distortion so the two cancel out. And yes, that looks "blurry" in the periphery, since that's where the maximum distortion is applied.
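As a rough sketch of what that pre-warp looks like: a simple radial distortion model pulls pixels toward the center more the farther out they are, so the lens's pincushion stretch brings them back. The polynomial model and the k1/k2 coefficients below are illustrative assumptions, not any particular headset's calibration.

```python
# Sketch of radial predistortion ("barrel" warp) to cancel lens pincushion.
# Coordinates are normalized so (0, 0) is the optical center and the edge of
# the view is near radius 1. k1, k2 are made-up example coefficients.

def predistort(u, v, k1=-0.22, k2=0.10):
    """Map an ideal image coordinate (u, v) to the predistorted coordinate
    that the lens will stretch back to (u, v)."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale

# A point near the edge gets pulled inward much more than one near the center:
print(predistort(0.1, 0.0))
print(predistort(0.9, 0.0))
```

The renderer applies this per pixel (usually in a shader, sampling the rendered frame through the inverse mapping), which is also why the periphery ends up undersampled and soft.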
What's discussed in that article is not vision correction via software. It appears to be a "coded-aperture mask" containing an array of pinholes that must be held in front of the display. The mask must be precisely tuned to each user (likely to each individual eye, in fact), and its position must be pixel-accurate. In effect, the mask acts as a flat lens.
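One consequence of any binary mask like this is easy to show: a mask whose cells are roughly half open passes roughly half the light, which is where the dimming mentioned below comes from. The random pattern and grid size here are purely illustrative; the actual mask in the article would be a carefully designed aperture code, not random.

```python
import random

# Toy binary aperture mask: True = open cell, False = opaque cell.
# With ~50% of cells open, throughput is ~50% regardless of the pattern.
random.seed(0)
N = 64
mask = [[random.random() < 0.5 for _ in range(N)] for _ in range(N)]

open_fraction = sum(sum(row) for row in mask) / (N * N)
print(open_fraction)  # close to 0.5 for this pattern
```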
Beyond that, it's just hard to know what they are proposing on the software side, since there's so little real information in that article and video. If you are interested, you can look up whatever paper(s) the researchers have produced on the topic. But don't expect some miraculous software-based vision correction from them any time soon. It still requires a "lens", just a mask-based one, with the disadvantages that it only works for one person (or one eye of one person) and that it makes the display dimmer, since it works by blocking about half the light.
There was a very interesting idea for sunglasses a while back that's a little similar. Imagine that your sunglasses are black and white LCD displays. You then activate (make transparent) a single pixel and race it across the entire display in a raster fashion. This creates a moving pinhole, which can act to correct near or far vision (looking through a small pinhole, just about everything is equally sharp). This went nowhere, but it's an example of a simple mask-based imaging system, turning a flat filter (this time an LCD) into a lens of sorts. It could even work well for an HMD, but at a severe cost in display brightness.
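The brightness cost of the raster-pinhole trick falls straight out of the geometry: only one cell is open at a time, so each location on the lens passes light for only 1/(W*H) of each scan. A minimal sketch, with an arbitrary example grid size:

```python
# Sketch of the raster-scanned pinhole: exactly one LCD cell is transparent
# at any instant, swept row by row. Grid dimensions are illustrative.

def pinhole_positions(width, height):
    """Yield the (x, y) of the single open cell, in raster-scan order."""
    for y in range(height):
        for x in range(width):
            yield x, y

W, H = 8, 6
frames = list(pinhole_positions(W, H))

# Each cell is open for one step out of W*H, so time-averaged transmission
# through any point of the "lens" is 1/(W*H) -- the severe brightness cost.
duty_cycle = 1 / (W * H)
print(len(frames), duty_cycle)
```

Even this tiny 8x6 grid gives about 2% transmission; a grid fine enough to act as a useful pinhole over a full lens would be far dimmer still, which is presumably why the idea went nowhere.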