HoloVID

From Wikipedia, the free encyclopedia

HoloVID is a measuring instrument originally developed in 1981 by Mark Slater for the holographic dimensional measurement of the internal isogrid structural webbing of the Delta family of launch vehicles.

History

Delta launch vehicles were produced by McDonnell Douglas Astronautics until the line was purchased by Boeing. The isogrid panels were milled out of T6 aluminum on 40-by-20-foot (12 by 6 m) horizontal mills, and inspection of the huge sheets took longer than the original manufacturing. It was estimated that a real-time in situ inspection device could cut costs, so an Independent Research and Development (IRAD) budget was created to solve the problem. Two solutions were pursued simultaneously by Mark Slater: a photo-optical technique using a holographic lens and an ultrasonic technique using configurable multiplexed micro-transducer arrays.

A pair of HoloVIDs providing simultaneous front-side and back-side weld feedback was later used at Martin Marietta to inspect the long weld seams that hold the Space Shuttle External Tank together. By controlling the weld bead profile in real time as it was laid down by TIG welding, an optimum weight-to-performance ratio could be obtained, saving the rocket engines from wasting thrust energy while guaranteeing the highest possible web strengths.

Usage

Many corporations (Kodak, Immunex, Boeing, Johnson & Johnson, The Aerospace Corporation, Silverline Helicopters, and others) use customized versions of the Six Dimensional Non-Contact Reader with Integrated Holographic Optical Processing for applications ranging from supercomputer surface-mount pad assessment to genetic biochemical assay analysis.

Specifications

HoloVID belongs to the class of sensors known as structured-light 3D scanners. The use of structured light to extract three-dimensional shape information is a well-known technique.[1][2] The use of single planes of light to measure the distance and orientation of objects has been reported several times.[3][4][5]
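
The depth recovery behind such single-point or single-plane-of-light readers reduces to triangulation between the projected beam and the viewing optics. The sketch below is a minimal Python illustration of that relation; the geometry (camera at the origin, laser offset by a baseline and optionally tilted toward the optical axis) and all numeric values are illustrative assumptions, not HoloVID parameters.

```python
import math

def spot_depth(u_px, pixel_pitch_m, focal_length_m, baseline_m, beam_tilt_rad=0.0):
    """Estimate the depth Z of a laser spot from its image-plane position.

    Assumed geometry (illustrative, not HoloVID-specific): the camera sits at
    the origin looking along +z; the laser source is offset by `baseline_m`
    along +x and tilted toward the optical axis by `beam_tilt_rad`. A spot on
    a surface at depth Z then lies at X = baseline_m - Z*tan(tilt), so its
    image coordinate is u = f*X/Z, which solves to Z = f*b / (u + f*tan(tilt)).
    """
    u_m = u_px * pixel_pitch_m  # convert the pixel offset to metres on the sensor
    return (focal_length_m * baseline_m) / (u_m + focal_length_m * math.tan(beam_tilt_rad))

# Example: 16 mm lens, 10 µm pixels, 50 mm baseline, beam parallel to the axis.
# A spot imaged 40 pixels off-axis corresponds to a surface 2.0 m away.
print(spot_depth(u_px=40, pixel_pitch_m=10e-6, focal_length_m=16e-3, baseline_m=50e-3))
```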

The use of multiple planes[6][7][8] and multiple points[9][10] of light to measure shapes and construct volumetric estimates of objects has also been widely reported.[11]
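
With multiple calibrated planes of light, each illuminated pixel can be converted into a 3-D surface point by intersecting its viewing ray with the known light plane, and the resulting point cloud supports shape and volume estimates. The following is a minimal sketch under assumed pinhole-camera calibration values; it is a generic illustration, not the method used by any of the cited systems.

```python
import numpy as np

def ray_plane_point(pixel, K_inv, plane_n, plane_d):
    """Intersect the camera ray through `pixel` with a calibrated light plane.

    The plane is n . X = d in camera coordinates; K_inv is the inverse camera
    intrinsic matrix. Returns the 3-D point on the illuminated surface.
    (Illustrative geometry; the calibration values below are assumptions.)
    """
    ray = K_inv @ np.array([pixel[0], pixel[1], 1.0])  # direction of the viewing ray
    t = plane_d / (plane_n @ ray)                      # scale so the ray meets the plane
    return t * ray

# Example: pinhole camera with 800-pixel focal length, principal point (320, 240),
# and a vertical light plane x = 0.1 m (n = [1, 0, 0], d = 0.1).
K_inv = np.linalg.inv(np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]]))
print(ray_plane_point((480, 240), K_inv, np.array([1.0, 0, 0]), 0.1))  # -> [0.1, 0.0, 0.5]
```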

The use of segmented phase holograms to selectively deflect portions of an image wavefront is unusual. The holographic optical components used in this device split tessellated segments of a returning wavefront into programmable bulk areas and shaped patches. This increases both the size of object that can be read and the measurable z-axis depth per point, while also increasing the number of simultaneous operations possible, a significant advance over the previous state of the art.

Operational modes

A laser beam is made to impinge onto a target surface; the incident beam need not be orthogonal to the surface. The light is reflected by the surface in a wide conical spread function that is geometrically related to the incidence angle, the wavelength, and the relative surface roughness. A portion of this reflected light enters the optical system coaxially, where a 'stop' shadows the edges. In a single-point reader, this edge is viewed along a radius by a photodiode array.

The device produces a boxcar-style output in which the photodiodes are lit sequentially, diode by diode, as the object distance changes relative to the sensor, until either no diodes or all diodes are lit. The residual charge in each photodiode cell is a function of the bias current, the dark current, and the incident radiation (in this case, the returning laser light).
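
A minimal sketch of how such a boxcar readout might be converted into a displacement reading is shown below; the threshold and the millimetres-per-diode scale factor are illustrative assumptions rather than HoloVID calibration values.

```python
def edge_position(diode_counts, threshold):
    """Return the count of 'lit' diodes in a boxcar-style readout.

    diode_counts: raw integration values per photodiode cell (bias current,
    dark current, plus returned laser light). Cells above `threshold` count
    as lit. Returns None if no diode is lit and len(diode_counts) if all are,
    mirroring the two saturation conditions described above.
    """
    lit = [i for i, v in enumerate(diode_counts) if v > threshold]
    return None if not lit else lit[-1] + 1

def displacement_mm(diode_counts, threshold, mm_per_diode):
    """Map the shadow-edge position to a displacement of the target surface."""
    n = edge_position(diode_counts, threshold)
    return None if n is None else n * mm_per_diode

# Example: a 16-cell array where the first 6 cells are illuminated.
counts = [900] * 6 + [120] * 10
print(displacement_mm(counts, threshold=500, mm_per_diode=0.05))  # -> 0.30 mm
```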

In the multipoint system, the HoloVID, the cursor point is acousto-optically scanned along the x-axis across a monaxial transformer. A monaxial holographic lens collects the wavefront and reconstructs the pattern onto a one-dimensional photodiode array and a two-dimensional matrix sensor. Image processing of the sensor data derives the correlation between the compressed wavefront and the actual physical object.
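
As an illustration of the kind of correlation-based processing alluded to above, the sketch below uses a generic normalized cross-correlation to locate the stripe response in one scanned column of the linear array; it is a stand-in for, not a description of, the actual HoloVID image processing, and the signal shapes and values are assumptions.

```python
import numpy as np

def stripe_shift(column, template):
    """Locate the stripe response in one scanned column by cross-correlation.

    `column` is the 1-D intensity profile read off the linear array for one
    x-scan position; `template` is an assumed reference stripe profile.
    Returns the integer shift of the best match.
    """
    c = (column - column.mean()) / (column.std() + 1e-12)
    t = (template - template.mean()) / (template.std() + 1e-12)
    corr = np.correlate(c, t, mode="valid")   # slide the template along the column
    return int(np.argmax(corr))

# Example: a Gaussian stripe template matched against a noisy scan column
# whose peak sits at sample 37; the template itself peaks at offset 4.
x = np.arange(64)
template = np.exp(-0.5 * ((np.arange(9) - 4) / 1.5) ** 2)
column = np.exp(-0.5 * ((x - 37) / 1.5) ** 2) \
    + 0.05 * np.random.default_rng(0).normal(size=64)
print(stripe_shift(column, template))  # -> 33, i.e. 33 + 4 = 37, the stripe peak
```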

References

  1. ^ Agin, Gerald J. (February 1979). "Real Time Control of a Robot with a Mobile Camera" (Document). SRI International, Artificial Intelligence Center. Technical note 179.
  2. ^ Bolles, Robert C.; Fischler, Martin A. (24 August 1981). "A RANSAC-based approach to model fitting and its application to finding cylinders in range data". Proceedings of the 7th International Joint Conference on Artificial Intelligence. Vol. 2. pp. 637–643.
  3. ^ Posdamer, J. L.; Altschuler, M. D. (January 1982). "Surface Measurement by Space-encoded Projected Beam Systems". Computer Graphics and Image Processing. 18 (1): 1–17. doi:10.1016/0146-664X(82)90096-X.
  4. ^ Popplestone, R. J.; Brown, C. M.; Ambler, A. P.; Crawford, G. F. (3 September 1975). "Forming Models of Plane-and-Cylinder Faceted Bodies from Light Stripes" (PDF). Proceedings of the 4th International Joint Conference on Artificial Intelligence. Vol. 1. pp. 664–668.
  5. ^ Oshima, Masaki; Shirai, Yoshiaki (April 1983). "Object Recognition Using Three-Dimensional Information" (PDF). IEEE Transactions on Pattern Analysis and Machine Intelligence. 5 (4): 353–361. doi:10.1109/TPAMI.1983.4767405. PMID 21869120. S2CID 17612273. Archived from the original (PDF) on 2016-10-19.
  6. ^ Albus, J.; Kent, E.; Nashman, M.; Mansbach, P.; Palombo, L.; Shneier, M. (22 November 1982). "Six-Dimensional Vision System". In Rosenfeld, Azriel (ed.). Proceedings of the SPIE: Robot Vision. Vol. 0336. pp. 142–153. Bibcode:1982SPIE..336..142A. doi:10.1117/12.933622. S2CID 64868995.
  7. ^ Okada, S. (1973). "Welding machine using shape detector". Mitsubishi-Denki-Giho (in Japanese). 47 (2): 157.
  8. ^ Taenzer, Dave (1975). "Progress Report on Visual Inspection of Solder Joints" (Document). Massachusetts Institute of Technology, Artificial Intelligence Lab. Working Paper 96.
  9. ^ Nakagawa, Yasuo (22 November 1982). "Automatic Visual Inspection Of Solder Joints On Printed Circuit Boards". In Rosenfeld, Azriel (ed.). Proceedings of the SPIE: Robot Vision. Vol. 0336. pp. 121–127. Bibcode:1982SPIE..336..121N. doi:10.1117/12.933619. S2CID 109280087.
  10. ^ Duda, R. O.; Nitzan, D. (March 1976). "Low-level processing of registered range and intensity data" (Document). SRI International, Artificial Intelligence Center. Technical note 129.
  11. ^ Nitzan, David; Brain, Alfred E.; Duda, Richard O. (February 1977). "The Measurement and Use of Registered Reflectance and Range Data in Scene Analysis". Proceedings of the IEEE. Vol. 65. pp. 206–220. doi:10.1109/PROC.1977.10458. S2CID 8234002.