Wednesday, August 10, 2005

How holographic 3D displays should work


It should be possible to make a flat display which would make true 3D objects appear behind it as well as in front of it, without glasses or anything of the sort. I will try to suggest how exactly it should be done. Requirements: basic school physics, including wave optics and how interference works.

[1] Brief explanation about conventional holograms
Conventional holograms, unlike regular photographs which are just a projection of the world onto some surface, can actually store and display 3D images. I will try to explain briefly how they work.

We usually see the world around us in 3D because we can look from different viewpoints and see slightly different things. That's basically why we have two eyes - to get depth information via the trick of parallax. Each eye by itself is not a pinhole camera either: it can also sense distance, though with lower precision, by means of its 'autofocus'. This is, by the way, the reason why carnivores have both eyes at the front of the face while herbivores have them shifted to the sides: hunting requires estimating distance precisely, while for running away early enough a wide field of view is preferred over parallax.

We can, obviously, see a 3D pig even through a window (which is, for all practical purposes, a transparent, flat, framed piece of nothing). The light passing through this plane therefore contains all the 3D information about the pig on the other side. If we could record that light over the window plane and over time (X*Y*T) and reproduce it, we're done. So X*Y*T instead of, well, X*Y*Z should be good enough - that's the big idea behind holography. From a wave point of view, all the properties the light has while passing through the window are the electric field components at every moment. (In fact, one component and its phase would appear to be enough.)
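To make the "one component and its phase" claim concrete, here is a minimal numpy sketch (the numbers and the single-point "pig" are assumptions, purely for illustration): for monochromatic light, everything the window plane carries about a point source behind it is one complex number per window point - an amplitude and a phase.

```python
# A minimal sketch of the "window" idea, assuming monochromatic light and a
# scalar (one-component) field. All numbers are illustrative.
import numpy as np

lam = 633e-9                # wavelength in metres
k = 2 * np.pi / lam         # wavenumber

# A grid of points on the window plane (z = 0), in metres.
xs = np.linspace(-0.05, 0.05, 201)
ys = np.linspace(-0.05, 0.05, 201)
X, Y = np.meshgrid(xs, ys)

# One point of the "pig", half a metre behind the window.
px, py, pz = 0.0, 0.02, -0.5

r = np.sqrt((X - px)**2 + (Y - py)**2 + pz**2)
field = np.exp(1j * k * r) / r   # spherical wave: amplitude ~1/r, phase k*r

# field.real is what the field looks like in the window plane at one instant;
# np.abs(field) and np.angle(field) are the amplitude and phase we would need
# to reproduce in order to "replay" this point of the pig.
```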

For static objects behind the window it doesn't sound reasonable that we would need to record the emerging light for an infinite time. Obviously, a shorter period should be enough. How short? About 1/frequency - the time it takes light to advance by a full wavelength - should suffice.
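As a quick sanity check of that number (the wavelength value is just an assumed example), one optical period for red light comes out to roughly two femtoseconds:

```python
# Rough order-of-magnitude check with assumed numbers.
c = 3.0e8        # speed of light, m/s
lam = 650e-9     # a red wavelength, m
T = lam / c      # one optical period
print(T)         # ~2.2e-15 s, i.e. about two femtoseconds
```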

How do we do it? Here comes Gabor's ingenious trick: catch a complete snapshot of the field over the whole plate (a few wavelengths thick) by means of interference with a reference beam, and later reproduce the same light configuration by illuminating the recording with another reference beam. For more information, go there.
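Here is a rough 1D numpy sketch of Gabor's trick, just to show the algebra; the geometry, wavelength and numbers are assumptions, not a description of any real recording setup. The plate records only the intensity |object + reference|^2, and illuminating that recording with the reference again brings back a term proportional to the original object wave.

```python
# A minimal 1-D sketch of hologram recording and reconstruction,
# assuming a single object point and a tilted plane reference wave.
import numpy as np

lam = 633e-9                         # wavelength, metres
k = 2 * np.pi / lam
x = np.linspace(-1e-3, 1e-3, 4001)   # points across the "plate"

# Object wave: spherical wave from a point 10 cm behind the plate.
z_obj = 0.10
r = np.sqrt(x**2 + z_obj**2)
obj = np.exp(1j * k * r) / r
obj /= np.abs(obj).max()             # normalise so the numbers stay tame

# Reference wave: a plane wave hitting the plate at a small angle.
theta = np.radians(1.0)
ref = np.exp(1j * k * np.sin(theta) * x)

# Recording: the plate only stores intensity, i.e. |obj + ref|^2.
hologram = np.abs(obj + ref) ** 2

# Reconstruction: illuminate the recorded pattern with the reference again.
# Expanding |obj + ref|^2 * ref gives, among other terms, |ref|^2 * obj,
# i.e. a copy of the original object wave - the reconstructed image.
out = hologram * ref
```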

[2] So how do we do this with a programmable display?
We need to control the light's amplitude and phase at every point of the suggested display plane, to make them the same as they would be for the imaginary "window" in front of the pig. Computing these phases would be, by the way, a huge computational task - but we'll fall from that bridge when we get there. Let's first think about how to produce the required phases and amplitudes on sub-wavelength scales.
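To give a feel for how huge, here is a back-of-the-envelope estimate; the display size, element pitch and scene complexity are all made-up numbers, assumed only to show the order of magnitude.

```python
# Back-of-the-envelope cost of the naive "sum a phasor from every scene point
# at every display element" computation. All numbers are assumptions.
display_w, display_h = 0.30, 0.20      # a 30 cm x 20 cm panel
pitch = 200e-9                         # sub-wavelength element pitch
elements = (display_w / pitch) * (display_h / pitch)     # ~1.5e12 elements

scene_points = 1e6                     # point sources approximating the pig

# Naively, every scene point contributes one complex phasor to every element:
operations_per_frame = elements * scene_points            # ~1.5e18
print(f"{elements:.1e} elements, {operations_per_frame:.1e} phasor sums per frame")
```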

[3] Dealing with phases.
Silicon chip manufacturing has been able to produce sub-light-wavelength structures for a long time already: the Pentium 4 uses a 90nm process, about 8 times smaller than the wavelength of red light. Some fiber structure should be made to bring coherent laser light to every location of the chip, creating two types of pixels at each place: a "straight" one and a "90 degrees phase shifted" one of the same wavelength. These would be gated by tiny liquid crystal cells (LCD-on-silicon) to form a linear combination of the two; with such a pair we can create any phase relative to the source at that pixel. Correction: we actually need 3 pixels of the same wavelength, because an LCD cell can produce only positive coefficients (letting more or less light through), and a basis of two would require negative coefficients. So in fact we need 3 pixels shifted by 120 degrees in a matrix, times 3 for RGB - 9 elements in total for each "pixel". It's not really a pixel, since it doesn't map to anything in the actual image; it's only a display element, and the interference of all of them together is what produces each and every voxel of the resulting pig.
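Here is a small numpy sketch of that 3-pixel trick, purely as an illustration of the math (not a real drive scheme): because the three 120-degree phasors sum to zero, any desired amplitude-and-phase can be written as a non-negative combination of them.

```python
# Driving three 120-degree-shifted sub-pixels with non-negative
# ("transmit-only", LCD-style) coefficients so their sum has any desired
# amplitude and phase. Illustrative math only.
import numpy as np

def three_phase_weights(z):
    """Return (a, u): weights a[k] >= 0 and phasors u[k] = exp(i*2*pi*k/3)
    such that sum(a[k] * u[k]) == z."""
    u = np.exp(1j * 2 * np.pi * np.arange(3) / 3)
    a = (2.0 / 3.0) * np.real(z * np.conj(u))   # signed projections onto each phasor
    a -= a.min()    # the three phasors sum to zero, so a constant offset changes
                    # nothing - use it to make all coefficients non-negative
    return a, u

z = 0.4 * np.exp(1j * np.radians(75))    # some target amplitude and phase
a, u = three_phase_weights(z)
print(a)                                 # all >= 0
print(np.allclose(np.dot(a, u), z))      # True: the sub-pixels add up to z
```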

Perhaps it would be possible to do the heavy computation (3D -> phases) on the same chip structure, so that the display takes a 3D scene layout directly as input.

[4] Possible applications
With the same technique we could produce objects on the other side of the window - our side. Obviously we can, because lenses can create such images, and we can simulate any beam of light. That being so, we could tile room walls with such displays to produce any optical setup inside the room, create virtual personalities for 3D online conferences, and other Star Trek-like stuff - true virtual reality.
On the other hand, you could create concealing boxes which display the exact 3D background as if the box were not there, an invisible-man suit (blind like all invisible men, of course), and other kinds of fun.

I heard recently that this firm has tried to use a similar holographic technique to create lensless 2D wall projectors which do not have to be focused (they are sharp from every distance, as if they were pinhole projectors) - clearly a good step in the direction of real 3D holographic monitors.

Welcome to HowStuffShouldWork

Everybody knows http://www.howstuffworks.com. It used to be a kinda stupid site for kids, which would tell you that "a microwave oven uses microwave radiation to heat things" instead of "a magnetron is a vacuum tube with a thin cathode and a cylindrical anode with cavities, in a uniform magnetic field parallel to the axis, ...". However, it is now a much better place (if you have a good popup blocker).

Even so, they cannot explain to you how color TV works (http://electronics.howstuffworks.com/tv10.htm). (In fact, I went looking there so I could say "they can even explain why a color CRT TV's electron gun(s) never miss the blue dot and hit the red one instead, even though they frequently miss the whole screen by 2 cm", but, unfortunately, they got it all confused.) Anyway - howstuffworks is a great site, and I won't be the one to tell you about color TV.

The purpose of this blog is not to explain how things work. Instead, we'll talk about how they should have worked. In other words - let's talk about the world's current problems in all areas of technology: communications, transport, vehicles, spaceflight, the computer hardware and software industry, etc. I'm going to write about those of my ideas which I don't have enough lifetimes (or other resources) to implement myself. All the others go to the Private Teramips Ideas Forum :)