
Undistorting Projections

images still to come


Projection onto a flat surface

Your projection target is a flat surface, like a wall. Your content is unrelated to that flat surface: you just want to turn a part of that projection target into a rectangular screen... Using Homography (Transform 2d) you can easily pre-distort the image you project, so that it appears undistorted on the surface. This always works, no matter how the projector is rotated against the surface, how its lens is shifted, or from which position it projects.
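
To see the math behind that 2d homography, here is a rough sketch in Python/numpy (not vvvv; the corner coordinates are made-up example values that would normally come from measuring or tweaking):

 import numpy as np
 
 def homography_from_corners(src, dst):
     """Solve for the 3x3 homography H that maps the four src corners
     onto the four dst corners (standard DLT, with h33 fixed to 1)."""
     A, b = [], []
     for (x, y), (u, v) in zip(src, dst):
         A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
         A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
     h = np.linalg.solve(np.array(A, float), np.array(b, float))
     return np.append(h, 1.0).reshape(3, 3)
 
 def apply_homography(H, x, y):
     """Map one point; the perspective divide is what keystones the image."""
     p = H @ np.array([x, y, 1.0])
     return p[:2] / p[2]
 
 # corners of the rectangular content (normalized) ...
 src = [(0, 0), (1, 0), (1, 1), (0, 1)]
 # ... and where they should land within the projector's output (hypothetical)
 dst = [(0.05, 0.10), (0.95, 0.02), (0.90, 0.93), (0.08, 0.88)]
 H = homography_from_corners(src, dst)
 print(apply_homography(H, 0.5, 0.5))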

Projection onto an arbitrary surface

You want to project onto an arbitrary surface like the facade of a house, a sculpture, a dome or anything else. Since the surface is not flat, a simple projection onto it looks distorted.
That is:

  • it wouldn't look distorted when viewed from the perspective of the projector
  • it looks distorted when viewed perpendicularly onto a part of its surface.

Because what you really want is a surface-oriented projection, as if the image on that surface were a property of the surface itself and the projector didn't even exist. Let's call that surface property a texture. You would expect that texture to look the same, independent of the angle towards some projector position - note that the projector is just the stupid hack that realizes your wanted surface property.

But since you're stuck with that hack, in order for the projected image to appear undistorted on the surface, you need to distort your image according to all the features of the target surface.

Incredibly enough, that distortion is theoretically quite easy to achieve: it can be realized by building an exact virtual replica of your real scene,

  • including the target projection surface as a 3d model
  • together with a replica of the real projector, sharing position, orientation and lens characteristics with the real projector.

By rendering the 3d model from the same viewpoint and with the same lens characteristics as the real projector, you get an image that perfectly fits the projection surface. Any flat textures you give the 3d model will look undistorted on the real surface and therefore become the textures of the real architecture we talked about earlier.
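
To make the "same viewpoint, same lens characteristics" idea concrete, here is a small Python/numpy sketch of how a virtual projector could be parameterized; the position, throw ratio and aspect are hypothetical example values (a shifted lens would additionally need an off-center frustum):

 import numpy as np
 
 def look_at(eye, target, up=(0, 1, 0)):
     """View matrix of the virtual projector (its position and orientation)."""
     f = np.array(target, float) - np.array(eye, float); f /= np.linalg.norm(f)
     r = np.cross(f, up); r /= np.linalg.norm(r)
     u = np.cross(r, f)
     m = np.identity(4)
     m[0, :3], m[1, :3], m[2, :3] = r, u, -f
     m[:3, 3] = -m[:3, :3] @ np.array(eye, float)
     return m
 
 def projector_frustum(throw_ratio, aspect, near=0.1, far=100.0):
     """Projection matrix from the lens characteristics: a throw ratio of
     throw_ratio means the image is 1/throw_ratio units wide at 1 unit distance."""
     fov_x = 2.0 * np.arctan(0.5 / throw_ratio)
     fov_y = 2.0 * np.arctan(np.tan(fov_x / 2.0) / aspect)
     f = 1.0 / np.tan(fov_y / 2.0)
     return np.array([
         [f / aspect, 0, 0, 0],
         [0, f, 0, 0],
         [0, 0, (far + near) / (near - far), 2 * far * near / (near - far)],
         [0, 0, -1, 0]])
 
 # hypothetical measured values for the real projector
 view = look_at(eye=(2.0, 1.5, -3.0), target=(0.0, 1.5, 0.0))
 proj = projector_frustum(throw_ratio=1.6, aspect=16 / 9)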

Why does it work? Think in rays.

In real space a projector beams straight rays of light from ONE point (its light bulb) onto the different points of the architecture (let's concentrate on some of them in the drawing). We call it a projection of a 2d image into 3d space. Now go to that projector position and look at the scene, or take a picture. What you see is based on the same principle, just the other way around! Rays of light travel from each point of the architecture to your eye/lens (the same ONE point in space), making a 2d projection of the 3d scene. You can explain both with the same drawing, only by switching the direction of the ray arrows. It is the same, or better: the exact reverse.

abstract:

 Record = Anti-Project  and  Project = Anti-Record

with

 Record (3d) -> (2d)
 Project (2d) -> (3d) 
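
A tiny Python sketch of that symmetry, assuming an idealized pinhole with focal length 1: Record takes a 3d point to a 2d point, Project sends the 2d point back out along the very same ray.

 import numpy as np
 
 # idealized pinhole at the origin, looking down -z, focal length 1
 f = 1.0
 
 def record(point3d):
     """Record: (3d) -> (2d), the perspective divide onto the image plane."""
     x, y, z = point3d
     return np.array([f * x / -z, f * y / -z])
 
 def project(pixel2d, depth):
     """Project: (2d) -> (3d), the exact reverse: a pixel becomes a ray,
     and the depth tells how far along that ray the architecture is hit."""
     u, v = pixel2d
     z = -depth
     return np.array([u * -z / f, v * -z / f, z])
 
 p = np.array([0.3, 0.2, -2.0])
 assert np.allclose(project(record(p), depth=2.0), p)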

Praxis & Practice

If we render that exact scene in virtual space from the same angle (but with the desired textural properties on the model), we get the same image that the real projector "sees", but with that additional textural information. By projecting it the same way as we recorded it, the real model gets enriched with the surface textures of the virtual model.

This works perfectly - in theory - but chances are that your virtual scene's parameters will never exactly match your real-world parameters, as some of them, like the projector's rotation, are hard to measure. But never mind: this is where the manual tweaking of those parameters begins, and with some patience and practice you'll get quite good results.

To build a virtual copy of your 3d scene you should use

  • the Projector (EX9) module, which is modelled & parameterized after the properties of a real projector
  • GridEditor (EX9) for creating and modelling a rectangular surface

or

  • modelling software together with PointEditor (3D Persistent) to alter more complicated meshes

or

  • Build your mesh dynamically and parameterize it in a way that lets you tweak it without having to drag each point manually (a minimal sketch of this follows below)
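
As a rough illustration of that last option (in Python/numpy rather than as a vvvv patch), a curved screen can be generated from a handful of tweakable parameters instead of dragging points:

 import numpy as np
 
 def cylinder_segment_grid(radius, height, angle, cols, rows):
     """Grid mesh for a curved screen, driven by a few tweakable parameters
     (radius, height, opening angle) rather than hand-edited points."""
     verts, uvs = [], []
     for j in range(rows + 1):
         for i in range(cols + 1):
             u, v = i / cols, j / rows
             a = (u - 0.5) * angle
             verts.append((radius * np.sin(a), v * height, radius * np.cos(a)))
             uvs.append((u, v))
     quads = [(j * (cols + 1) + i, j * (cols + 1) + i + 1,
               (j + 1) * (cols + 1) + i + 1, (j + 1) * (cols + 1) + i)
              for j in range(rows) for i in range(cols)]
     return np.array(verts), np.array(uvs), quads
 
 # made-up dimensions; tweak these until the virtual screen matches the real one
 verts, uvs, quads = cylinder_segment_grid(radius=3.0, height=2.4,
                                           angle=np.radians(120), cols=32, rows=8)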

Summary

We showed how to turn boring real objects into real objects with texture.
Others may still call it "undistorting projections", but indeed it is all about texturizing real objects.
So if you now have a complicated setup, think of what texture you want to have on your real architecture and then head for texturizing your virtual copy.

Texturizing Architecture

2d-Textures

One of the most astonishing things is when a sculpture with many faces is textured differently on each face. Texturing that is oriented on the real architecture and respects the features of the real geometry is a thrilling thing.

So, you need to 2d-texturize your virtual copy? For that you need to build a mesh with different sets of texture coordinates, so that you can put different textures onto the same model. Another solution is to export many meshes for the differently textured parts and render them one after the other, with different texture sources...

However, keep in mind that this kind of texturing is based on 2d texture coordinates.

semi-abstract

 Wallpaper: (Texture.xy) -> Color
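
In code, that "Wallpaper" is nothing more than a plain 2d texture lookup; a minimal Python/numpy sketch, with a random array standing in for a loaded image:

 import numpy as np
 
 def wallpaper(texture, uv):
     """Wallpaper: (Texture.xy) -> Color, a plain 2d texture lookup."""
     h, w, _ = texture.shape
     x = int(np.clip(uv[0], 0, 1) * (w - 1))
     y = int(np.clip(uv[1], 0, 1) * (h - 1))
     return texture[y, x]
 
 tex = np.random.rand(256, 256, 3)   # stand-in for a loaded image
 print(wallpaper(tex, (0.25, 0.75)))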

3d-Textures

There is something like a 3d-Texture. Think of a tree which is textured at each point within its trunk; you just need to cut it to see that point.
3d-Textures can be loaded via FileTexture, but you can also create them on the fly via some tricky pixel shader code, spitting out a color dependent on the position in space.

semi-abstract

 Tree: (Position.xyz) -> Color
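
A minimal Python sketch of such a position-based color function, with made-up "growth rings" around the trunk axis (in vvvv this would live in a pixel shader instead):

 import numpy as np
 
 def tree(position):
     """Tree: (Position.xyz) -> Color, a procedural 3d texture: concentric
     growth rings around the trunk axis, visible wherever you cut the mesh."""
     x, y, z = position
     r = np.hypot(x, z)                   # distance from the trunk axis (y up)
     ring = 0.5 + 0.5 * np.cos(r * 40.0)  # the ring frequency is arbitrary
     return np.array([0.55, 0.35, 0.2]) * (0.6 + 0.4 * ring)
 
 print(tree((0.1, 1.2, 0.05)))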

Projection of a 3d-Space

Imagine a Cylindrical 360°-Projection or a Dome projection, where you have already rebuilt the screen, adjusted real and virtual projectors and all that. All of that was just to undistort the projection and pay respect to the curved screen and to the shift and orientation of the projectors, right?

A different kind of projection

But when it comes to the content, you don't want to pay respect to that boring screen. It is not like a sculpture, which is interesting on its own. You want to turn it into a room again.

So we are talking about a different kind of projection here: a projection not related to real projectors, but to the 3d content we want to dive into, regardless of which display technology we use. It is more that we want something like an environment map of our content.

Imagine you stand inside a dome projection and look around in all directions from one point in space. What you want to see is a room, not the screen. So what we need is a projection of the desired 3d scene onto our 2d screen in such a way that, when it is projected back to your eye, you feel like you're in that room.
So what we actually need is a projection of the whole scene from ONE pivot point, so that we are able to look around and see one consistent 3d scene.
Theoretically this only works perfectly when the viewer is at that pivot point from where we projected the content onto the screen, but what can you do, right? XD
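
A rough Python/numpy sketch of that idea, assuming the content is available as an equirectangular environment map: each point of the virtual screen simply gets the color that is seen from the pivot point in its direction (all names and numbers here are illustrative only):

 import numpy as np
 
 def content_lookup(envmap, direction):
     """Sample an equirectangular environment map of the content scene
     in a given direction (the 'room' as seen from one point)."""
     d = direction / np.linalg.norm(direction)
     u = 0.5 + np.arctan2(d[0], d[2]) / (2 * np.pi)
     v = 0.5 - np.arcsin(d[1]) / np.pi
     h, w, _ = envmap.shape
     return envmap[int(v * (h - 1)), int(u * (w - 1))]
 
 def shade_screen_point(point, pivot, envmap):
     """Color for one point of the virtual screen: follow the ray from the
     pivot (the intended viewer position) through that point into the content."""
     return content_lookup(envmap, np.asarray(point, float) - np.asarray(pivot, float))
 
 envmap = np.random.rand(512, 1024, 3)   # stand-in for the rendered content
 print(shade_screen_point((1.0, 1.6, 2.0), pivot=(0.0, 1.6, 0.0), envmap=envmap))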

Wrapping your head around it

So let's think of the whole setup in these steps:

  • Content Scene: virtual light sources emit light and make the scene visible
  • Content Scene: record the desired 3d scene into all directions (camera is free)
  • Setup Scene: use the same optics to project it onto the virtual screen (from one pivot point within the virtual copy)
  • Setup Scene: record from the projectors' perspectives (favorite display technology)
  • Real World: use the same projector perspectives to project onto the real screen (favorite display technology)
  • Real World: project back into the eye (works best when at the pivot point)

In the end it is the same idea all the time: shoot something and project it back. This time with the difference that you shoot a 3d scene which doesn't exist in the real world (and, vice versa, the real-world screen doesn't exist in the recorded virtual world) -- the captured, reprojected light rays now just travel until they hit the screen. This time not paying respect to the perspective of the projectors but to the perspective of the viewer. This time with the ability to move around or rotate the camera in the content scene, since it is not connected to the other worlds.

Special Case: Slitscan for 360° cylindrical projection

One way to capture the virtual scene is to use a slitscan shader within your animation software and put that long flat video as a 2d-texture onto your cylindrical model.
This simplifies everything a little, but gives fewer degrees of freedom. If you just use the 2d texture coordinates of your screen, no virtual projection is involved, and the pivot point is always in the middle as long as the texture coordinates are spread regularly across the screen. So this is only half of the idea, but it enables you to use slitscan video.
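
For illustration, a Python/numpy sketch of the slitscan capture itself: a virtual camera turns a full circle and only one pixel column per frame is kept, producing exactly that long flat texture (the renderer here is a dummy):

 import numpy as np
 
 def slitscan_capture(render_view, frames):
     """Build a 360° slitscan texture: for each frame the (virtual) camera has
     rotated a bit further and only the center column of its render is kept."""
     columns = []
     for f in range(frames):
         angle = 2 * np.pi * f / frames      # one full turn over all frames
         img = render_view(angle)            # -> (height, width, 3) array
         columns.append(img[:, img.shape[1] // 2])
     return np.stack(columns, axis=1)        # (height, frames, 3) long flat texture
 
 # dummy renderer: a solid color that varies with the viewing angle
 fake_render = lambda a: np.full((64, 8, 3), np.sin(a) * 0.5 + 0.5)
 panorama = slitscan_capture(fake_render, frames=360)
 print(panorama.shape)                       # (64, 360, 3)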

semi-abstract

 Wallpaper (Texture.xy) -> Color

If you can get your hands on several regular perspective views of the scene which together hold all the content needed, use those perspective views and project them onto the virtual screen. The rendering time within your animation software will probably also drop drastically.

General Case: Do everything in vvvv

When you have one computer per projection and also want to have generic 3d content, you only want to record as much content as is necessary for one projection. You can do that, and you don't need to do it slitscan-wise. Just render the scene perspectively on all computers from the same pivot position (but with differing rotation) and project the renders back onto the virtual screen. No need to undistort anything again: the virtual rays do it for you.

Just take care that

  • you capture the 3d content scene with the same optics as you use for projecting onto the virtual screen
  • all the different virtual cameras look from the same point in space, so that we're still talking about the same rays

The optical laws somehow hold in virtual space.
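
As a small Python sketch of those two constraints: all virtual cameras share one eye position and only differ in rotation, with their fields of view chosen so the views tile the full 360° (a count of 4 is just an example):

 import numpy as np
 
 def tiled_pivot_cameras(pivot, count):
     """Camera setups that share ONE eye position (the pivot) and differ only
     in yaw, with a horizontal fov of 360°/count so the views tile the circle."""
     fov = 2 * np.pi / count
     cams = []
     for i in range(count):
         yaw = i * fov
         cams.append({"eye": np.asarray(pivot, float),
                      "forward": np.array([np.sin(yaw), 0.0, np.cos(yaw)]),
                      "fov_x": fov})
     return cams
 
 for cam in tiled_pivot_cameras(pivot=(0.0, 1.6, 0.0), count=4):
     print(cam["forward"], np.degrees(cam["fov_x"]))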

Demo here: http://vvvv.org/tiki-download_file.php?fileId=1698 (small 360 degree setup.zip, 35.95 Kb)

semi-abstract

 Record (Cone, 3d scene) -> (2d Texture)
 Project (Cone, 2d screen in 3d) -> (2d Coordinates on Screen)
 Wallpaper (Texture.xy) -> Color

Examples

Here the Content Scene = Virtual Scene (= Real Scene anyway), so the architecture is both the screen and the content.

There is another record & back-project technique used here, however; it is needed for the perspective projection of light, resulting in shadows:

  • the scene is recorded as a depth texture from the light position
  • this depth texture is back-projected onto the scene with the same perspective again, and a shader does the shadowing (a sketch follows below). Look it up in the girlpower folder.
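
For reference, here is a rough Python/numpy sketch of that depth-compare step (not the actual girlpower shader; the matrix and depth map are stand-ins):

 import numpy as np
 
 def shadow_test(world_pos, light_view_proj, depth_map, bias=1e-3):
     """Classic shadow mapping: re-project a shaded point into the light's
     perspective and compare its depth with what the light 'recorded'."""
     p = light_view_proj @ np.append(np.asarray(world_pos, float), 1.0)
     ndc = p[:3] / p[3]                           # light-space position
     u, v = ndc[0] * 0.5 + 0.5, ndc[1] * 0.5 + 0.5
     h, w = depth_map.shape
     recorded = depth_map[int(v * (h - 1)), int(u * (w - 1))]
     return ndc[2] > recorded + bias              # True -> the point is in shadow
 
 # stand-ins: a depth texture rendered from the light position, identity matrix
 depth_map = np.ones((512, 512))                  # "nothing closer" everywhere
 light_view_proj = np.identity(4)
 print(shadow_test((0.2, 0.3, 0.4), light_view_proj, depth_map))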

Lightstrive

also see
How To Project On 3D Geometry

Perspective Distortion

Note that the projection will only look perspectively correct when viewed from the one point in the real world that corresponds to the virtual camera's position.
From all other points in the real world the projected image will not look correct. But never mind: in most cases this isn't a big deal, as our TV/cinema-trained eyes and brains are quite tolerant of such visual challenges as long as our viewing position isn't too far off the intended position. See: http://users.skynet.be/J.Beever/pave.htm

Auto calibration solution: http://www.domeprojection.com/
