US20090225333A1 - System aspects for a probe system that utilizes structured-light


Info

Publication number
US20090225333A1
Authority
US
United States
Prior art keywords: light, measurement mode, probe system, image, inspection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/249,513
Other versions
US8107083B2 (en)
Inventor
Clark Alexander Bendall
Kevin George Harding
Thomas Karpen
Guiju Song
Li Tao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baker Hughes Oilfield Operations LLC
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/042,821 (US7821649B2)
Priority to US12/249,513 (US8107083B2)
Application filed by General Electric Co
Assigned to GENERAL ELECTRIC COMPANY (assignment of assignors interest). Assignors: KARPEN, THOMAS; HARDING, KEVIN GEORGE; BENDALL, CLARK ALEXANDER; SONG, GUIJU; TAO, LI
Publication of US20090225333A1
Priority to EP09172042.5A (EP2175231B1)
Priority to CN2009102065334A (CN101726263B)
Priority to US13/100,826 (US8422030B2)
Priority to US13/334,239 (US8976363B2)
Publication of US8107083B2
Application granted
Assigned to BAKER HUGHES OILFIELD OPERATIONS, LLC (assignment of assignors interest). Assignor: GENERAL ELECTRIC COMPANY
Legal status: Active (current)
Expiration: Adjusted


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/954 Inspecting the inner surface of hollow bodies, e.g. bores
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/954 Inspecting the inner surface of hollow bodies, e.g. bores
    • G01N2021/9542 Inspecting the inner surface of hollow bodies, e.g. bores using a probe
    • G01N2021/9544 Inspecting the inner surface of hollow bodies, e.g. bores using a probe with emitter and receiver on the probe

Definitions

  • the subject matter described herein relates generally to borescopes and endoscopes, and more particularly, to a borescope/endoscope which provides 3D surface mapping and dimensional measurement.
  • Borescopes and endoscopes are typically used for inspection inside a remote cavity.
  • Most borescopes/endoscopes, referred to herein as a probe, employ an external light source coupled to fiber optic bundles in the probe to provide illumination of a remote object or surface at the distal end.
  • an internal image is formed by a lens system on an image sensor, and the image is relayed to a connected display, such as a video screen.
  • the image sensor may be located at the proximal end of the probe, as with an optical rigid borescope or fiberscope, or at the distal end as with a video borescope or endoscope.
  • Such systems are often used to inspect inaccessible locations for damage or wear or to verify that parts have been properly manufactured or assembled.
  • Phase-shift technology is well suited to addressing these measurement needs, but its implementation in a borescope or endoscope presents numerous system-level challenges. It is desirable to address these challenges in a manner that yields a reliable and easy-to-use system.
  • a probe system comprises an imager and an inspection light source.
  • the probe system is configured to operate in an inspection mode and a measurement mode.
  • inspection mode the inspection light source is enabled.
  • measurement mode the inspection light source is disabled, and a structured-light pattern is projected.
  • the probe system is further configured to capture at least one measurement mode image. In the at least one measurement mode image, the structured-light pattern is projected onto an object.
  • the probe system is configured to utilize pixel values from the at least one measurement mode image to determine at least one geometric dimension of the object.
  • a probe system comprises an imager, and the probe system is configured to operate in an inspection mode and a measurement mode. Diffuse illumination light is projected during inspection mode, and a structured-light pattern is projected during measurement mode. The probe system is further configured to capture at least one measurement mode image. In the at least one measurement mode image, the structured-light pattern is projected onto an object. The probe system is configured to utilize pixel values from the at least one measurement mode image to determine at least one geometric dimension of the object. The probe system is also configured to detect relative movement between a probe and the object between captures of two or more of a plurality of images.
  • FIG. 1 is a schematic diagram of a borescope/endoscope system in accordance with an embodiment of the present invention.
  • FIG. 2 is a graph showing the trajectory of an exemplary projection set projected from one side of the field of view (FOV).
  • FIG. 3 is a graph showing the trajectory of the structured-lines of one fringe set in each of a first and second exemplary projection set relative to a field of view.
  • FIG. 4 is a flow chart illustrating an exemplary embodiment of the steps involved in motion detection.
  • FIG. 5 is a flow chart illustrating an exemplary alternative embodiment of the steps involved in motion detection.
  • FIG. 6 is a flow chart illustrating an exemplary embodiment of the steps involved during an image capture sequence of the present invention.
  • An insertion tube 40 comprises elongated portion 46 and detachable distal tip 42 .
  • Elongated portion 46 comprises a main long, flexible portion, a bending neck, and a camera head.
  • Delineation line 41 shows where the camera head starts on elongated portion 46 .
  • the camera head of elongated portion 46 typically includes at least imager 12 , electronics 13 , and probe optics 15 .
  • Detachable distal tip 42 typically attaches to the camera head of elongated portion 46 , mentioned above.
  • Detachable distal tip 42 contains viewing optics 44 which are used in combination with probe optics 15 to guide and focus light received from the viewed surface or object (not shown) onto imager 12 .
  • the elements shown in tip 42 could alternatively be located on elongated portion 46 . These elements include viewing optics 44 , at least one emitter module 37 , at least one intensity-modulating element 38 , and light passing element 43 .
  • the at least one light emitter module 37, comprising a plurality of light emitters, could be fixedly attached to insertion tube 40 while the at least one intensity-modulating element is disposed on detachable tip 42. In this case, precise and repeatable alignment between detachable tip 42 and elongated portion 46 is required, but the arrangement is advantageous because it allows different fields of view while eliminating the need for contacts between elongated portion 46 and detachable tip 42.
  • imager 12 is located at the distal end of insertion tube 40 .
  • imager 12 may be located at the proximal end of insertion tube 40 .
  • the alternative configuration may be suitable, for example, in a rigid borescope or fiberscope.
  • Imager 12 obtains at least one image of the viewed surface.
  • Imager 12 may comprise, for example, a two-dimensional array of light-sensitive pixels that outputs a video signal in response to the light level sensed at each pixel.
  • Imager 12 may comprise a charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS) image sensor, or other devices of similar function.
  • Imager interface electronics 31 may include, for example, power supplies, a timing generator for generating imager clock signals, an analog front end for digitizing the imager video output signal, and a digital signal processor (DSP) 51 for processing the digitized imager video data into a more useful format for video processor 50 .
  • Video processor 50 performs various functions including, but not limited to, image capture, image enhancement, graphical overlay merging, and video format conversion, and stores information relating to those functions in video memory 52.
  • Video processor 50 may comprise a field-programmable gate array (FPGA), a camera DSP, or other processing elements and provides information to and receives information from central processing unit (CPU) 56. The provided and received information may relate to commands, status information, video, still images, and/or graphical overlays.
  • Video processor 50 also outputs signals to various monitors such as computer monitor 22 , video monitor 20 , and integral display 21 .
  • Video processor 50 also comprises motion detection module 53 and/or fringe contrast determining function 54 .
  • CPU 56, microcontroller 30 (described below), or probe electronics 48 comprising camera control electronics (not shown) may include motion detection module 53.
  • When connected, each of computer monitor 22, video monitor 20, and/or integral display 21 typically displays images of the object or surface under inspection, menus, cursors, and measurement results.
  • Computer monitor 22 is typically an external computer type monitor.
  • Video monitor 20 is typically an external video monitor.
  • Integral display 21 is integrated and built into probe system 10 and typically comprises a liquid crystal display (LCD).
  • CPU 56 preferably uses both program memory 58 and non-volatile memory 60 , which may include removable storage devices. CPU 56 may also use volatile memory such as RAM for program execution and temporary storage.
  • a keypad 64 and joystick 62 convey user input to CPU 56 for such functions as menu selection, cursor movement, slider adjustment, and articulation control.
  • Computer I/O interface 66 provides various computer interfaces to CPU 56 such as USB, Firewire, Ethernet, audio I/O, and wireless transceivers. Additional user I/O devices such as a keyboard or mouse may be connected to computer I/O interface 66 to provide user control.
  • CPU 56 generates graphical overlay data for display, provides recall functions and system control, performs phase-shift analysis and measurement processing, and provides image, video, and audio storage.
  • CPU 56 and the previously discussed video processor 50 may be combined into one element of probe system 10 .
  • components of probe system 10 including, but not limited to, CPU 56 and video processor 50 may be integrated and built into probe system 10 or, alternatively, be externally located.
  • the structured-light pattern preferably comprises parallel light and dark lines comprising sinusoidal intensity profiles. Line patterns having square, trapezoidal, triangular, or other profiles may be projected on the surface as well when used with appropriate phase-shift analysis to determine phase of the pattern.
  • the pattern may also comprise other than straight, parallel lines. For example, curved lines, wavy lines, zigzagging lines, or other such patterns may be used with appropriate analysis.
  • the structured-light pattern projected from the at least one emitter module 37 may be created a number of ways.
  • Emitter module 37 may comprise at least one light emitting element formed to include appropriate parallel light and dark lines. Light from the light emitting element may be passed through intensity modulating element 38 .
  • emitter module 37 may comprise a plurality of light emitters. The plurality of light emitters may be strategically positioned to form a structured-light pattern on the surface and/or light from the plurality of light emitters may be passed through intensity modulating element 38 .
  • intensity modulating element 38 comprises a line grating, which creates a structured-light pattern when light from emitter module 37 passes through to the surface or object (not shown).
  • a plurality of fringe sets are projected from the probe onto the viewed surface or object.
  • a fringe set comprises at least one structured-light pattern.
  • the structured-light pattern of one fringe set exhibits a spatial or phase-shift relative to the structured-light patterns of other fringe sets.
  • a fringe set comprises a structured-light pattern projected when one emitter group, comprising at least one light emitter, is emitting light.
  • a different subset of the plurality of light emitters emits light to project each of a plurality of structured-light patterns.
  • the plurality of light emitters of emitter module 37 are positioned such that the structured-light pattern projected when one emitter group is emitting light exhibits a spatial or phase-shift relative to the structured-light patterns projected when other emitter groups are emitting light.
  • Light from the plurality of light emitters disposed on detachable tip 42 is passed through at least one intensity modulating element 38 to alter the distribution of light and project at least one structured-light pattern on the viewed surface suitable for phase-shift analysis.
  • the plurality of light emitters comprising an emitter group are spaced apart along the axis perpendicular to the lines on the line grating by a distance equal to an integer number of periods of the line grating.
  • the plurality of light emitters in an emitter group are arranged in a line parallel to the lines on the line grating and are electrically connected in series. This approach reduces the current needed to achieve a given light output relative to the current that would be required with a single emitter. This is beneficial as the emitter power is generally supplied through small wires having significant resistance, and reducing the drive current reduces the power dissipated in the wires and supplied by the emitter drive circuit.
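  • The current-reduction benefit of the series arrangement can be illustrated with a short calculation. The following is a minimal sketch, assuming light output roughly proportional to drive current; the resistance and current values are hypothetical, not from the patent. Because wire dissipation scales with the square of the current, a series group of N emitters driven at the single-emitter current dissipates roughly 1/N² of the wire power that a single emitter driven N times harder would.

```python
# Illustrative arithmetic only; the component values below are hypothetical.

def wire_loss_watts(drive_current_a: float, wire_resistance_ohms: float) -> float:
    """I^2 * R power dissipated in the probe's long, thin drive wires."""
    return drive_current_a ** 2 * wire_resistance_ohms

N = 4            # emitters wired in series in one emitter group
I_SERIES = 0.1   # amps through the series string (hypothetical)
R_WIRE = 2.0     # ohms of round-trip wire resistance (hypothetical)

# Assuming light output roughly proportional to current, one emitter would need
# about N times the current to match the light of the N-emitter series group.
loss_series = wire_loss_watts(I_SERIES, R_WIRE)      # 0.02 W
loss_single = wire_loss_watts(N * I_SERIES, R_WIRE)  # 0.32 W
print(f"series group: {loss_series:.2f} W; single emitter: {loss_single:.2f} W")
```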
  • a plurality of light emitting diodes (LEDs) may comprise the plurality of light emitters of the at least one emitter module 37.
  • LEDs are practical in probe system 10 at least because LEDs offer consistent, uniform illumination, no speckling, and fast switching between fringe sets.
  • any light emitting source(s) offering the qualities mentioned above are sufficient for use in probe system 10 .
  • Other such light sources include, but are not limited to, organic LEDs, plasma elements, fiber coupled lasers, and laser arrays.
  • the at least one emitter module 37 on detachable tip 42 may further comprise electronics for control/sequencing of emitters, sensing temperature, and storage/retrieval of calibration data.
  • the at least one emitter module 37 may include a heat sink made of a ceramic or metal, for example, to reduce the temperature rise of the plurality of light emitters.
  • System 10 further comprises contacts 36 that electrically couple elongated portion 46 to detachable tip 42 through the camera head.
  • Contacts 36 may be spring loaded and also provide electrical power from drive conductor 35 to emitter module 37 .
  • drive conductor 35 carries power from emitter drive 32 to the plurality of light emitters disposed on the distal end of insertion tube 40 .
  • Drive conductor 35 comprises one or more wires and may be incorporated with signal line 14 in a common outer jacket (not shown).
  • Drive conductor 35 may also share conductors with signal line 14 and/or utilize the insertion tube 40 structure for carrying current.
  • Emitter drive 32 includes, for example, an adjustable current source with a variable on time to compensate for light emitters with differing power capabilities and efficiencies.
  • video processor 50 or CPU 56 comprises a brightness or fringe contrast determining function 54 to determine whether one emitter or multiple emitters should be enabled for each emitter group.
  • brightness determining function 54 communicates with emitter drive 32 to selectively transmit current through specific wires connected to emitter module 37 to light an appropriate number of emitters per emitter group. Further control over brightness can be achieved by varying the drive level applied to the emitters or the duration of time the emitters are driven.
  • brightness determining function 54 When brightness determining function 54 is located separately from emitter drive 32 one drive wire of drive conductor 35 connects emitter drive 32 to emitter module 37 , and one or more control wires (not shown) controlled by brightness determining function 54 are also connected to emitter module 37 .
  • a circuit (not shown) included on emitter module 37 can selectively connect one or multiple emitters to the drive wire in response to signals on the control wire.
  • emitter drive 32 comprises brightness determining function 54
  • drive conductor 35 comprises one or more drive wires (not shown) per emitter. In this case, brightness determining function 54 selectively transmits current through specific drive wires of drive conductor 35 to light an appropriate number of emitters per emitter group.
  • At least one calibrating-light pattern is projected onto the viewed surface or object. Projecting light from at least one of the plurality of light emitters may be used to create the at least one calibrating-light pattern on the surface or object.
  • the calibrating-light pattern may comprise at least one structured-light pattern, and passing light from at least one of the plurality of light emitters through intensity modulating element 38 may create at least one calibrating-light pattern on the object.
  • the calibrating-light pattern may include, but is not limited to, angled lines, a single line, a plurality of lines, a dot, a plurality of dots, and a plurality of parallel light and dark lines. It can be appreciated that fringe sets and calibrating-light patterns may be projected from the same emitter module 37 . This may be accomplished, for example, by spacing apart fringe set emitters and calibrating pattern emitters and passing light from them through separate areas of intensity modulating element 38 .
  • a first projection set and a second projection set are projected onto a surface.
  • a projection set comprises at least one fringe set comprising a structured-light pattern.
  • the structured-light pattern of one fringe set of the first projection set exhibits a phase-shift relative to the structured-light patterns of the other fringe sets of the first projection set.
  • the structured-light pattern of one fringe set of the second projection set exhibits a phase-shift relative to the structured-light patterns of other fringe sets of the second projection set.
  • the first projection set is projected from one side of viewing optics 44 and the second projection set is projected from the other side of viewing optics 44 .
  • a first projection set may alternatively be projected from the top of viewing optics 44 and a second projection set may be projected from the bottom of viewing optics 44, or vice versa. Even if insertion tube 40 is rotated, the first and second projection sets are projected from opposite positions or angles relative to the FOV. Therefore, the first projection set may be projected from any position or angle around viewing optics 44 that is opposite that of the second projection set.
  • Fringe sets 0 , 1 , and 2 of FIG. 2 comprise an exemplary projection set.
  • a plurality of fringe sets comprise the projection set.
  • the plurality of fringe sets comprising the projection set are typically projected from approximately the same origin relative to the FOV.
  • FIG. 3 shows a graph of two fringe sets.
  • The two fringe sets are projected from opposite sides of the FOV.
  • the fringe set in FIG. 3 represented by the solid lines projected from one side of the FOV comprises a first projection set
  • the fringe set in FIG. 3 represented by the dashed lines projected from the other side of the FOV comprises a second projection set.
  • a projection set comprises a plurality of fringe sets, each fringe set comprising a structured-light pattern, wherein the light pattern of one fringe set exhibits a phase-shift relative to the light patterns of the other fringe sets.
  • the first image set comprises fringe set images of the first projection set
  • the second image set comprises fringe set images of the second projection set, where one fringe set is projected onto the surface or object per image.
  • the probe operates in measurement mode when the at least one structured-light pattern is projected onto the surface.
  • emitter module 37 is enabled to project at least one structured-light pattern on the surface during measurement mode.
  • CPU 56 or video processor 50 captures a plurality of measurement mode images wherein the at least one structured-light pattern is projected onto the object.
  • the measurement mode images may comprise fringe sets where no more than one fringe set is projected onto the object per measurement mode image. Measurement mode images of that sort are also referred to herein as fringe set images. Phase-shift analysis may then be performed directly on the plurality of fringe set images.
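  • The patent does not spell out the phase-shift computation itself; as one concrete possibility, the sketch below applies the standard three-step algorithm for patterns shifted by 120 degrees (consistent with the three fringe sets 0, 1, and 2 of FIG. 2). The 120-degree assumption and the function names are illustrative, not the patent's specified method.

```python
import numpy as np

def wrapped_phase(i0: np.ndarray, i1: np.ndarray, i2: np.ndarray) -> np.ndarray:
    """Standard three-step phase recovery for fringe patterns shifted by 120 degrees.

    i0, i1, i2 are grayscale fringe set images, one projected pattern per image.
    Returns the wrapped phase in (-pi, pi] at each pixel; unwrapping comes later.
    """
    i0, i1, i2 = (np.asarray(i, dtype=np.float64) for i in (i0, i1, i2))
    return np.arctan2(np.sqrt(3.0) * (i0 - i2), 2.0 * i1 - i0 - i2)

def fringe_modulation(i0, i1, i2) -> np.ndarray:
    """Per-pixel fringe amplitude (contrast); low values flag unreliable phase."""
    i0, i1, i2 = (np.asarray(i, dtype=np.float64) for i in (i0, i1, i2))
    return np.sqrt(3.0 * (i0 - i2) ** 2 + (2.0 * i1 - i0 - i2) ** 2) / 3.0
```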
  • the probe operates in inspection mode when inspection light source 23 is enabled.
  • Light is projected from inspection light source 23 onto a surface or object.
  • the at least one structured-light pattern may be absent.
  • at least one image referred to herein as an inspection mode image, is captured when light is projected from inspection light source 23 onto the viewed surface or object.
  • Inspection light source 23 outputs relatively uniform light or diffuse illumination light from the distal end of insertion tube 40 .
  • the elements that produce and deliver light during inspection mode may collectively be referred to as an inspection light delivery system.
  • the inspection light delivery system comprises inspection light source 23 , source fiber bundle 24 , shutter mechanism 34 , probe fiber bundle 25 , and light passing element 43 .
  • the inspection light delivery system may comprise very different elements such as, in the case of distally-located white LEDs, an LED drive circuit that can be disabled or provides an adjustable output current, wires for delivering power to the LEDs, the LEDs themselves, and a protective element to protect the LEDs.
  • the inspection light delivery system comprises a proximal LED coupled to a fiber bundle, which delivers light to the distal end of insertion tube 40 , and an LED drive circuit.
  • the intensity of light output from the inspection mode light delivery system originating from light source 23 is automatically decreased or disabled during measurement mode to avoid reducing the contrast of the at least one projected structured-light pattern.
  • CPU 56 may be configured to command light source 23 off electronically, through an enable/disable input to light source 23, prior to projecting the at least one structured-light pattern.
  • Inspection light source 23 may then be automatically enabled, for example, electronically after a plurality of measurement mode images are captured or upon exiting measurement mode.
  • Shutter mechanism 34 is configured to allow light output from the inspection light delivery system during inspection mode or regular inspection and block or otherwise inhibit light output originating from inspection light source 23 during measurement mode.
  • Shutter mechanism 34 includes, for example, a solenoid or motor driven mechanical shutter or an electric light source disabler. When shutter mechanism 34 allows light from inspection light source 23 to pass, shutter mechanism 34 is in an open position. When shutter mechanism 34 blocks light from inspection light source 23 , shutter mechanism 34 is in a closed position. During inspection mode, shutter mechanism 34 is configured to be in an open position. In contrast, during fringe set projection, shutter mechanism 34 is configured to be in a closed position. The location of shutter mechanism 34 can vary based on its implementation. In an embodiment of the invention, when shutter mechanism 34 allows light to pass, probe fiber bundle 25 delivers light to the surface or inspection site via light passing element 43 .
  • Inspection light source 23 is typically a white light source, but may comprise any appropriate light source for a probe such as a mercury or metal halide arc lamp, halogen lamp, laser/phosphor system, or LED based light source which could be either proximally or distally located.
  • source fiber bundle 24 may be included in system 10 .
  • Source fiber bundle 24 comprises a non-coherent or semi-coherent fiber optic bundle and transmits light to shutter mechanism 34 .
  • source fiber bundle 24 may be omitted, and shutter mechanism 34 may be located directly between inspection light source 23 and probe fiber bundle 25 .
  • Probe fiber bundle 25 comprises a non-coherent fiber optic bundle.
  • Light passing element 43 comprises a glass cane, formed fibers, and/or distribution control features such as lenses or a diffuser.
  • At least one counterpart inspection mode image and at least one counterpart measurement mode image are captured.
  • the at least one counterpart measurement mode image comprises at least one of the plurality of measurement mode images
  • the at least one counterpart inspection mode image comprises at least one inspection mode image captured in close time proximity to the at least one counterpart measurement mode image. Inspection mode images and measurement mode images captured in close time proximity are referred to herein as counterpart images.
  • counterpart images comprise images of an object in the same position relative to the FOV.
  • Capturing counterpart images in close time proximity is advantageous at least because the relative movement between the probe's distal tip and the viewed object between the captures of the counterpart images is minimized.
  • Geometrical features, such as defects and edges, will appear in the same position in the counterpart images, so that the locations of cursors positioned on an inspection-mode image correspond to the same points on the viewed object in the measurement-mode images.
  • motion detection module 53 analyzes inspection mode and measurement mode counterpart images.
  • Motion detection module 53 may be configured to analyze the images once all of the images have been captured. Alternatively, motion detection module 53 may be configured to analyze the images sequentially after the capture of each image. Motion detection module 53 is configured to automatically detect probe and/or surface movement between measurement mode images, also referred to herein as fringe set images or the images captured comprising structured-light patterns. Motion detection module 53 may be configured to compare only inspection mode images or only measurement mode images. Furthermore, motion detection module 53 may optionally be configured to compare at least one measurement mode image with its counterpart inspection mode image(s). In an embodiment of the invention, counterpart inspection mode image(s) may be captured at the beginning and/or the end of its counterpart measurement mode capture sequence.
  • Motion detection module 53 could be further configured to compare one or more captured images from each of two or more successive measurement mode capture sequences such that images having the same illumination and/or structured light patterns present may be compared rather than attempting to compensate for differences in pattern position or illumination.
  • The term “measurement mode capture sequence” as used herein is defined as the capture of a plurality of structured-light images, each captured image comprising one projected fringe set. During a measurement mode capture sequence, a plurality of measurement mode images are captured.
  • Probe system 10 is configured to detect relative movement between the probe and the surface or object between the captures of two or more of a plurality of images.
  • motion detection module 53 is configured to analyze the images captured and compute a motion metric indicative of relative movement between the probe's distal tip and the surface or object between the captures of two or more of a plurality of images.
  • These images may comprise the first and the last of a plurality of images.
  • the first of the plurality of images is either an inspection mode image or a measurement mode image.
  • the last of the plurality of images is either an inspection mode image or a measurement mode image.
  • the capture of the plurality of images is repeated until either the motion metric indicates a low probability of movement or a pre-determined timeout occurs. For example, if the motion metric indicates a high probability of movement between counterpart measurement mode and inspection mode images, the capture of the measurement mode and inspection mode image(s) is repeated until the motion metric indicates a low probability of movement or a pre-determined timeout occurs.
  • re-capture of the entire plurality of images may not always be necessary.
  • the value of the motion metric can depend upon the implementation of motion detection.
  • the metric could be pixels of movement, in which case the metric may be limited to a one-pixel movement, for example, for indicating a low probability of movement. In that case, any metric representing a movement greater than one pixel would indicate a high probability of movement.
  • the metric limit for a low probability of movement could also be experimentally determined.
  • one method for experimentally determining metric limits includes using a root mean square (RMS) difference between brightness values.
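  • A minimal sketch of such an RMS-based metric follows; the limit value is a placeholder for an experimentally determined one.

```python
import numpy as np

def rms_brightness_difference(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Root mean square difference between the brightness values of two frames."""
    a = np.asarray(img_a, dtype=np.float64)
    b = np.asarray(img_b, dtype=np.float64)
    return float(np.sqrt(np.mean((a - b) ** 2)))

RMS_LIMIT = 4.0  # hypothetical experimentally determined limit, in 8-bit gray levels

def low_probability_of_movement(img_a: np.ndarray, img_b: np.ndarray) -> bool:
    """A metric at or below the limit indicates a low probability of movement."""
    return rms_brightness_difference(img_a, img_b) <= RMS_LIMIT
```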
  • FIG. 4 is a flow chart illustrating an exemplary embodiment of the steps involved in motion detection.
  • Borescope/endoscope or probe system 10 shown in FIG. 1 is configured to perform the steps indicated in method 400 .
  • Method 400 may be implemented when probe system 10 is in inspection mode, and the CPU 56 receives a command requesting measurement. An operator may request measurement by pressing a button (not shown) on the probe system 10 or selecting a menu item from, for example, integral display 21 .
  • CPU 56 or video processor 50 captures a first inspection mode image.
  • CPU 56 then sends a command to microcontroller 30 to enter measurement mode.
  • Microcontroller 30 controls emitter drive 32 to perform a measurement mode capture sequence.
  • CPU 56 or video processor 50 captures measurement mode images.
  • At least one measurement mode image is captured per structured-light pattern or fringe set.
  • An operator may pre-program the specifics of the measurement mode capture sequence before the implementation of method 400 by selecting a menu item from integral display 21 . For example, the operator may desire the capture of a plurality of measurement mode images per structured-light pattern or fringe set. In addition, those images of the same structured-light pattern or fringe set may be captured at the same brightness level or at different brightness levels depending on the analysis and/or mapping desired.
  • emitter drive 32 is disabled by microcontroller 30 , and microcontroller 30 configures DSP 51 for inspection mode.
  • CPU 56 or video processor 50 captures a second inspection mode image.
  • Motion detection module 53 then analyzes the first and second inspection mode images to determine a motion metric at step 408. If the motion metric indicates an unacceptable degree of motion at step 410, and the pre-set time limit is not reached at step 412, steps 402-412 are repeated until the motion metric indicates an acceptable degree of motion or the pre-set time limit is reached. Alternatively, if the motion metric indicates an unacceptable degree of motion at step 410, and the pre-set time limit is reached at step 412, the process ends at step 99. If the motion metric indicates an acceptable degree of motion, the process ends at step 99.
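  • In code form, the retry loop of method 400 might look like the following sketch; the capture and metric callables stand in for the hardware-specific operations described above, and all names are illustrative.

```python
import time

def method_400(capture_inspection_image, run_measurement_capture_sequence,
               motion_metric, metric_limit: float, time_limit_s: float):
    """Control-flow sketch of FIG. 4 (steps 402-412): repeat the capture sequence
    until the motion metric between the first and second inspection mode images
    indicates an acceptable degree of motion, or the pre-set time limit is hit.
    Returns the captured images, or None on timeout."""
    deadline = time.monotonic() + time_limit_s
    while True:
        first = capture_inspection_image()                  # first inspection mode image
        fringe_images = run_measurement_capture_sequence()  # measurement mode images
        second = capture_inspection_image()                 # second inspection mode image
        if motion_metric(first, second) <= metric_limit:    # acceptable degree of motion
            return first, fringe_images, second
        if time.monotonic() >= deadline:                    # pre-set time limit reached
            return None
```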
  • motion detection module 53 is configured to analyze the images captured based on techniques such as high-frequency detail position comparison. Points in the images that include fast transitions in brightness can be identified in the first image in the sequence, and those points can be checked in one or more subsequent images to determine whether the fast transitions still occur at the same points. This approach can accommodate differences in illumination as would exist between measurement mode and inspection mode images. Images captured under the same lighting conditions, such as inspection-mode images captured before and after the counterpart measurement-mode images, can be simply subtracted from one another to determine whether the image has substantially changed.
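  • The sketch below illustrates one way such a comparison could be implemented: sharp horizontal brightness transitions found in the first image are re-checked in a later image. The gradient definition and thresholds are assumptions, not details given in the patent.

```python
import numpy as np

def sharp_transition_points(img: np.ndarray, grad_limit: float = 40.0):
    """Coordinates of pixels with fast horizontal brightness transitions."""
    g = np.abs(np.diff(np.asarray(img, dtype=np.float64), axis=1))
    ys, xs = np.nonzero(g >= grad_limit)
    return list(zip(ys.tolist(), xs.tolist()))

def fraction_still_sharp(points, later_img: np.ndarray, grad_limit: float = 40.0) -> float:
    """Fraction of previously sharp points still sharp in a later image; a low
    fraction suggests the probe or object moved between the two captures."""
    if not points:
        return 1.0
    g = np.abs(np.diff(np.asarray(later_img, dtype=np.float64), axis=1))
    hits = sum(1 for y, x in points if g[y, x] >= grad_limit)
    return hits / len(points)
```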
  • FIG. 5 is a flow chart illustrating an exemplary alternative embodiment of the steps involved in motion detection.
  • Borescope/endoscope or probe system 10 shown in FIG. 1 is configured to perform the steps indicated in method 500 .
  • Method 500 may be implemented when probe system 10 is in inspection mode, and the CPU 56 receives a command requesting measurement. An operator may request measurement by pressing a button (not shown) on the probe 10 or selecting a menu item from, for example, integral display 21 .
  • CPU 56 or video processor 50 captures an inspection mode image.
  • the CPU 56 or video processor 50 identifies sharp brightness transition points in the inspection mode image captured at step 502 .
  • CPU 56 then sends a command to microcontroller 30 to enter measurement mode.
  • microcontroller 30 controls emitter drive 32 to illuminate one emitter group to project a first fringe set.
  • CPU 56 or video processor 50 captures the first fringe set images. At least one measurement mode image is captured for the first fringe set.
  • the CPU 56 or video processor 50 identifies sharp brightness transition points in at least one of the first fringe set images captured at step 506 .
  • Motion detection module 53 then compares the identified sharp brightness transition points of the inspection mode image with those of the fringe set image(s) at step 510 . At step 512 motion detection module 53 determines a motion metric based on that comparison. If the motion metric indicates an unacceptable degree of motion at step 514 , and the time limit is reached at step 516 , the process ends at step 99 .
  • steps 502-516 are repeated until the motion metric indicates an acceptable degree of motion or the pre-set time limit is reached.
  • the sequence is repeated from step 502 to update the inspection mode image because it is unlikely that the measurement mode images will again line up with the original inspection image.
  • the sequence may be repeated from step 506 to compare the captured fringe set images or two or more measurement mode images comprising the same structured-light pattern to each other until they all match up and then capture another inspection mode image at the end.
  • steps 506 - 514 are repeated for the second fringe set, then the third fringe set, etc.
  • the process ends after sequencing through steps 506-514 for the last fringe set once all of the fringe set images are captured for that fringe set and the motion metric indicates an acceptable degree of motion for the fringe set image(s) in the last fringe set.
  • Probe electronics 48 may be physically separated from a main control unit or CPU 56 to provide more local control over probe-related operations.
  • Probe electronics 48 further comprise calibration memory 33 .
  • Calibration memory 33 stores information relating to the optical system of detachable tip 42 and/or elongated portion 46 such as magnification data, optical distortion data, and pattern projection geometry data.
  • Calibration memory 33 stores information relating to the intensity relationship between the light projected from light source 23 and the light projected from emitter module 37 .
  • the intensity relationship between the light projected by light source 23 and the light projected by emitter module 37 can be pre-determined before any image capture.
  • the brightness or intensity from light source 23 is greater than the brightness or intensity from emitter module 37. Therefore, the imager 12 exposure time and/or the analog gain applied to the video signal output by imager 12 during inspection mode image capture should be different from those during measurement mode image capture.
  • Microcontroller 30, which controls shutter 34, communicates with CPU 56, controls emitter drive 32 circuitry, communicates with imager interface electronics 31 to determine and set gain and exposure settings, and stores and reads calibration data from calibration memory 33.
  • Probe system 10 further comprises one or more of a gain function, an exposure function, a gamma correction function and an edge enhancement function applied to image data originating from imager 12 .
  • Probe system 10 is configured to automatically adjust the parameters of at least one of said functions when switched between inspection mode image capture and measurement mode.
  • the relative intensities of the light output by the inspection light delivery system and the structured-light patterns are determined during a calibration step and stored in calibration memory 33.
  • DSP 51 included in imager interface electronics 31 may be configured to automatically adjust imager 12 exposure and front end analog gain to achieve optimal image brightness for inspection mode image capture.
  • Microcontroller 30 is configured to compute the exposure function and gain function parameters to use during measurement mode from the exposure function and gain function values, read from DSP 51, that are active during inspection mode. Microcontroller 30 is further configured to compute the parameters of the exposure and gain functions according to a pre-determined intensity relationship between the light of the structured-light patterns and the light from inspection light source 23, and to set DSP 51 to apply the adjusted exposure and gain settings to optimize image brightness for measurement mode image capture. For example, the parameters of the exposure and gain functions are adjusted such that the brightness in the plurality of fringe set images is similar to the brightness in the inspection mode image(s).
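  • As an illustration of that computation, the sketch below scales the inspection-mode exposure and gain by a calibrated intensity ratio; preferring exposure over gain and the clamp value are assumptions, not details given in the patent.

```python
def measurement_mode_settings(inspection_exposure_us: float,
                              inspection_gain: float,
                              intensity_ratio: float,
                              max_exposure_us: float = 33000.0):
    """Derive fixed measurement-mode exposure/gain from the values active during
    inspection mode and the pre-determined intensity relationship
    (inspection light intensity / structured-light pattern intensity)."""
    target = inspection_exposure_us * inspection_gain * intensity_ratio
    exposure_us = min(target, max_exposure_us)  # lengthen exposure first (less noise)
    gain = target / exposure_us                 # cover any remainder with analog gain
    return exposure_us, gain

# Example: patterns 8x dimmer than the inspection light (hypothetical ratio).
print(measurement_mode_settings(2000.0, 1.0, 8.0))  # -> (16000.0, 1.0)
```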
  • DSP 51 may be again configured for automatic gain and exposure adjustment to optimize image brightness for inspection-mode image capture.
  • FIG. 6 is a flow chart illustrating an exemplary embodiment of the steps involved during an image capture sequence of the present invention.
  • image capture sequence used herein is defined as the capture of counterpart inspection mode and measurement mode images.
  • image capture sequence is not to be confused with the term “measurement mode capture sequence” defined above.
  • Borescope/endoscope or probe system 10 shown in FIG. 1 is configured to perform the steps indicated in method 600 .
  • Method 600 may be implemented when probe system 10 is in inspection mode, and the CPU 56 receives a command requesting measurement. An operator may request measurement by pressing a button (not shown) on the probe 10 or selecting a menu item from, for example, integral display 21 .
  • CPU 56 or video processor 50 captures the inspection mode image(s).
  • CPU 56 sends a command to microcontroller 30 to enter measurement mode.
  • microcontroller 30 reads the analog gain and exposure from DSP 51, and at step 608, microcontroller 30 adjusts the gain and exposure for measurement mode. As discussed above, the measurement mode DSP 51 settings may be adjusted according to a pre-determined intensity relationship between the inspection light delivery system and the structured-light patterns. Further at step 608, microcontroller 30 sets DSP 51 to fixed gain and exposure based on the adjusted values.
  • the inspection light is disabled by microcontroller 30 or CPU 56 as discussed previously.
  • microcontroller 30 controls emitter drive 32 to perform a measurement mode capture sequence.
  • performing a measurement mode capture sequence comprises sequencing through emitter groups (different subsets of light emitters) on frame boundaries while possibly adjusting on-time or drive level to compensate for different emitter brightness levels.
  • a drive level supplied to one subset of light emitters may be adjusted to compensate for a temperature difference between that subset of light emitters and another subset of light emitters.
  • Different emitter brightness levels may be due to differing emitter efficiencies or to heating of the emitters as the sequence progresses. For example, if the emitters are LEDs, the efficiency generally decreases as temperature increases. When the first LED is turned on, emitter module 37 is cooler than when the last LED is turned on.
  • the last LED requires more drive current to achieve the same output as the first LED.
  • the difference in drive levels may be predetermined through a calibration step.
  • LED forward voltage drop also typically decreases as temperature increases.
  • microcontroller 30, in conjunction with emitter drive 32, may measure the LED forward voltage drop to determine LED temperature and more accurately compensate for the efficiency change.
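  • A sketch of that compensation follows; the temperature coefficient and efficiency-loss rate are typical hypothetical values, not calibration data from the patent.

```python
def led_junction_temp_c(vf_measured: float, vf_at_25c: float,
                        tempco_v_per_c: float = -0.002) -> float:
    """Estimate LED temperature from the measured forward voltage drop, assuming
    a typical coefficient of roughly -2 mV per degree C (hypothetical)."""
    return 25.0 + (vf_measured - vf_at_25c) / tempco_v_per_c

def compensated_drive_ma(base_drive_ma: float, temp_c: float,
                         efficiency_loss_per_c: float = 0.004) -> float:
    """Raise the drive current to offset light output lost to heating of the
    emitter module as the capture sequence progresses (hypothetical rate)."""
    derate = min(max(0.0, (temp_c - 25.0) * efficiency_loss_per_c), 0.5)
    return base_drive_ma / (1.0 - derate)

# Example: Vf fell 40 mV from its 25 C value -> about 45 C -> ~9% more current.
t = led_junction_temp_c(3.16, 3.20)
print(t, compensated_drive_ma(100.0, t))  # 45.0, ~108.7
```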
  • CPU 56 or video processor 50 captures measurement mode images. At least one measurement mode image is captured per fringe set. In addition, a plurality of measurement mode images may be captured per fringe set such that measurement mode images of each fringe set are captured at the same brightness level; also, a plurality of measurement mode images may be captured per fringe set such that the plurality of fringe set images of at least one fringe set are captured at different brightness levels.
  • Motion detection module 53 analyzes the images for motion at step 616. If motion is detected at step 618, and the pre-set time limit is reached at step 620, the process ends at step 99. If motion is detected, and the pre-set time limit is not reached, steps 612-620 are repeated until motion is not detected or the pre-set time limit is reached. Alternatively, if motion is not detected at step 618, CPU 56 sends a command to microcontroller 30 to enter inspection mode. At step 624, emitter drive 32 is disabled by microcontroller 30. At step 626, microcontroller 30 configures DSP 51 for inspection mode by setting DSP 51 for automatic gain and exposure adjustment. At step 628, CPU 56 or microcontroller 30 enables inspection light output.
  • CPU 56 or video processor 50 may again capture inspection mode image(s), as in step 402 . This marks the end of the image capture sequence.
  • Method 600 may be repeated automatically to sequence through the steps a pre-determined number of times. Alternatively, an operator may manually command the repetition of method 600 by requesting measurement each time a new image capture sequence is desired.
  • probe system 10 does not have to directly enter inspection mode if there is no motion detected at step 618 .
  • the user is given an option to either enter inspection mode at step 622 or to enter a measurement screen (not shown).
  • the measurement screen displays a counterpart inspection mode image, preferably captured from step 602 , while analysis or measurement is performed on the at least one counterpart measurement mode image, preferably captured from step 614 .
  • the measurement screen enables the placement of measurement cursors on the counterpart inspection mode image while the actual analysis or measurement is performed on data representing the at least one counterpart measurement mode image.
  • the emitter drive is disabled at step 624 so that structured-light patterns are not projected.
  • The sequence then resumes at step 626, where microcontroller 30 configures DSP 51 for inspection mode by setting DSP 51 for automatic gain and exposure adjustment.
  • Probe system 10 is configured to change the parameters of imager 12 analog gain and exposure functions through DSP 51 when switched between inspection mode and measurement mode. Probe system 10 is also configured to automatically adjust other processing parameters of DSP 51 , including, but not limited to, gamma correction and edge enhancement, when switched between inspection mode and measurement mode.
  • imager 12 responds to light in a linear manner.
  • a non-linear re-mapping of the intensity or luminance values is often performed by the DSP 51 to improve the perceived brightness uniformity for image display.
  • Non-linear re-mapping of the intensity values of images captured during inspection mode may be desirable.
  • the probe system 10 is configured to decrease an effective level of gamma correction applied to pixels of at least one measurement mode image relative to the level of gamma correction applied during inspection mode.
  • a linear gamma DSP setting is enabled or switched on during measurement mode.
  • during inspection mode, the linear gamma DSP setting is typically disabled, i.e., gamma is set to be non-linear, to improve the perceived inspection-mode image quality.
  • edge enhancement artificially modifies the brightness linearity of an image. This is generally not desirable for images on which phase-shift analysis is performed as images representative of a linear response to light are preferred. Therefore, the edge enhancement function is disabled or switched off for measurement mode image capture, and may be enabled or switched on for inspection mode viewing and inspection mode image capture. Probe system 10 is configured to reduce an effective level of edge enhancement applied to pixels of at least one measurement mode image relative to the level of edge enhancement applied during inspection mode.
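  • The mode-dependent processing settings described above can be summarized as in the sketch below; the parameter names and values are hypothetical, since the actual DSP 51 interface is not described in the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DspSettings:
    """Hypothetical stand-in for DSP 51 processing parameters."""
    gamma: float              # 1.0 preserves the imager's linear response to light
    edge_enhancement: bool    # artificially modifies brightness linearity
    auto_gain_exposure: bool  # automatic vs. fixed gain/exposure adjustment

# Inspection mode: non-linear gamma (~1/2.2 encoding) and edge enhancement improve
# perceived image quality; gain and exposure adjust automatically.
INSPECTION = DspSettings(gamma=0.45, edge_enhancement=True, auto_gain_exposure=True)

# Measurement mode: keep pixel data linear for phase-shift analysis and hold
# gain/exposure at the fixed, computed values.
MEASUREMENT = DspSettings(gamma=1.0, edge_enhancement=False, auto_gain_exposure=False)
```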
  • probe system 10 may be configured to capture a plurality of measurement mode images with the same structured-light pattern or same fringe set present and average or sum two or more of those measurement mode images. The result is a plurality of composite images where only one fringe set is present in each composite image. These measurement mode images with the same structured-light pattern or same fringe set present may be captured with the same or similar brightness levels.
  • When the composite image of a projected fringe set is a result of summing rather than averaging, the dynamic range is increased and noise is reduced. Whether the composite images are a result of summing or averaging, phase-shift analysis and other processes can be performed on the composite images.
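  • A minimal compositing sketch, assuming 8-bit input frames: summing into a wider integer type raises the dynamic range (and improves signal-to-noise roughly with the square root of the frame count), while averaging keeps the original range.

```python
import numpy as np

def composite_fringe_image(frames, mode: str = "sum") -> np.ndarray:
    """Combine repeated captures of the same projected fringe set into one
    composite image; frames are images of the same structured-light pattern."""
    stack = np.stack([np.asarray(f, dtype=np.uint32) for f in frames])
    return stack.sum(axis=0) if mode == "sum" else stack.mean(axis=0)
```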
  • measurement mode images comprising the same structured-light pattern or same projected fringe set may be captured with different brightness or intensity levels. This is accomplished by either manually or automatically changing the emitter output intensity or duration, imager 12 exposure, analog gain, or some combination thereof.
  • fringe contrast determining function 54 is configured to determine whether one emitter or multiple emitters should be enabled for each emitter group. In order to change the light source intensity, for example, fringe contrast determining function 54 may further be configured to sequence between enabling one emitter and multiple emitters per emitter group to project light. Also discussed above, microcontroller 30 communicates with imager interface electronics 31 to determine and set gain and exposure settings. Similarly, microcontroller 30 may configure emitter drive 32 to alter the amount of power delivered to emitter module 37 and thus vary the intensities of the projected fringe sets.
  • a plurality of measurement mode images captured with different brightness or intensity levels comprising the same projected fringe set may be combined to effectively increase the dynamic range of the system.
  • an image may be captured of a shiny metal surface comprising a structured-light pattern. Reflective properties of the shiny metal surface may prevent adequate intensity levels in dark areas. Therefore, multiple images for each fringe set may be captured with different intensity levels so that, in at least some images, the dark areas are properly illuminated. Then, the captured images for each fringe set may be combined resulting in a single image for each fringe set with sufficient intensity levels across a larger portion of the image than could be achieved with a single image per fringe set.
  • CPU 56 or video processor 50 may be configured to analyze an inspection-mode image prior to capturing the measurement-mode images to determine brightness uniformity. The analysis may be performed by generating and evaluating a histogram of pixel luminance values. A histogram having most of the pixels in a middle brightness range would indicate that a single set of measurement-mode images would likely be adequate. A histogram having mostly very bright pixels and very dark pixels may indicate a highly reflective surface that would benefit from the merging of multiple measurement-mode image sequences captured at different brightness levels.
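  • One way that histogram check could be implemented is sketched below; the bin limits and the fraction threshold are assumptions.

```python
import numpy as np

def needs_multiple_brightness_levels(inspection_img, dark_limit: int = 32,
                                     bright_limit: int = 224,
                                     extreme_fraction: float = 0.5) -> bool:
    """True when the luminance histogram is dominated by very bright and very
    dark pixels, suggesting a highly reflective surface that would benefit from
    merging measurement-mode sequences captured at different brightness levels."""
    hist, _ = np.histogram(np.asarray(inspection_img), bins=256, range=(0, 256))
    extreme = hist[:dark_limit].sum() + hist[bright_limit:].sum()
    return extreme / hist.sum() > extreme_fraction
```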
  • multiple measurement mode images that comprise the same projected structured-light pattern or fringe set and are captured with different intensity levels are combined by evaluating each pixel intensity in the multiple images and selecting the set of pixels that best meets a set of criteria, such as maximum modulation, brightness values, or pixel unsaturation. Pixel values from more than one measurement mode image comprising the same structured-light pattern or same fringe set may be utilized to determine at least one geometric dimension of the surface or object.
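  • The sketch below merges captures of one fringe set taken at different intensity levels using the pixel-unsaturation criterion (keep the brightest unsaturated value per pixel); the patent also names maximum modulation and brightness values as possible criteria, and the saturation level here is a hypothetical value.

```python
import numpy as np

def merge_intensity_levels(images, saturation_level: int = 250) -> np.ndarray:
    """Per pixel, keep the brightest value below the saturation level across
    captures of the same fringe set at different intensity levels."""
    stack = np.stack([np.asarray(i, dtype=np.int32) for i in images])
    candidates = np.where(stack < saturation_level, stack, -1)  # mask saturated pixels
    best = candidates.max(axis=0)
    # Pixels saturated at every level fall back to the saturation level itself.
    return np.where(best >= 0, best, saturation_level).astype(np.uint8)
```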
  • the discussion above generally relates to image capture by probe system 10 .
  • the discussion below generally relates to the use and storage of those captured images.
  • image(s) comprising structured-light patterns captured during measurement mode are hidden from the operator, and are used only for the actual analysis and measurement.
  • CPU 56 or video processor 50 creates an image file comprising data representing a counterpart inspection mode image and creates a hidden record within the image file comprising data representing its at least one counterpart measurement mode image.
  • the counterpart inspection mode image can be displayed while the operator positions overlay cursors for determining geometric measurements of a viewed object while analysis is performed on the counterpart measurement mode image to determine the geometric measurement results. Therefore, the operator may command analysis or measurement while viewing an inspection mode image even though the analysis or measurement is performed on its counterpart measurement mode image or images. This can also be done prior to image storage, for example, as previously discussed in relation to FIG. 6 , where the inspection mode image is displayed and the measurement images are processed.
  • the operator skill requirement is reduced by allowing the user to place cursors on a normal, inspection mode, image without having to worry about stereo matching, perpendicularity, shadow position, etc.
  • the operator may use joystick 62 to place cursors on an inspection mode image displayed on integral display 21 .
  • keypad 64 and/or computer I/O interface 66 may also be used to place cursors, and interchangeably with integral display 21 , computer monitor 22 and/or video monitor 20 may also be used to display the inspection mode image.
  • graphical overlay data is added to the image such as cursor positions, measurement results, accuracy indicators, etc.
  • This graphical overlay must be removable to enable easy re-measurement at a later time, when the non-graphical data related to the image is required.
  • A method for storing image-specific data from a probe, specifically calibration data, is known from U.S. Pat. No. 7,262,797. It is further desirable to store measurement data, for example, data relating to phase-shift analysis. Measurement data includes, but is not limited to, the luminance portion of a structured-light image (luminance data), measurement cursor positions, merged image data, measurement types, results, accuracy indications, calibration data, phase data, and phase-shift analysis data.
  • Phase data may include data representing the phases of the structured-line patterns, for example wrapped phase, unwrapped phase, relative phase, and/or absolute phase.
  • Phase-shift analysis data includes, but is not limited to, surface or object distance data (z values for each pixel, where z is the object distance from the probe), and point cloud data (x, y, z values for each pixel).
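  • As a sketch of how point cloud data can follow from the per-pixel object distance, the code below applies an ideal pinhole camera model; the intrinsic parameters stand in for stored calibration data, and a real system would also correct optical distortion.

```python
import numpy as np

def point_cloud_from_z(z: np.ndarray, fx: float, fy: float,
                       cx: float, cy: float) -> np.ndarray:
    """Convert a per-pixel object-distance map z into (x, y, z) point cloud data
    using an ideal pinhole model (fx, fy, cx, cy are hypothetical intrinsics)."""
    h, w = z.shape
    u, v = np.meshgrid(np.arange(w, dtype=np.float64),
                       np.arange(h, dtype=np.float64))
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.dstack([x, y, z])  # shape (h, w, 3): x, y, z values for each pixel
```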
  • the method disclosed in U.S. Pat. No. 7,262,797 should include the additional step of “writing measurement data to file.”
  • the system may include a geometric measurement mode in which a counterpart inspection mode image is displayed, an operator positions measurement cursors on the inspection mode image to identify measurement points, and CPU 56 computes and displays measurement results based on 3D data derived through phase-shift analysis performed on the counterpart measurement images utilizing calibration data.
  • CPU 56 creates an image file. It merges overlay data with the inspection image data and saves the result in the image file such that when the file is opened by a standard image viewer, the inspection mode image and overlay are displayed.
  • CPU 56 also creates hidden records in the image file. In these hidden records, it stores the inspection image data that was overwritten by the overlay data, referred to as overlay-replacement data, and measurement data.

Abstract

A probe system includes an imager and an inspection light source. The probe system is configured to operate in an inspection mode and a measurement mode. During inspection mode, the inspection light source is enabled. During measurement mode, the inspection light source is disabled, and a structured-light pattern is projected. The probe system is further configured to capture at least one measurement mode image. In the at least one measurement mode image, the structured-light pattern is projected onto an object. The probe system is configured to utilize pixel values from the at least one measurement mode image to determine at least one geometric dimension of the object. A probe system configured to detect relative movement between a probe and the object between captures of two or more of a plurality of images is also provided.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a Continuation-in-Part of and claims priority from U.S. Ser. No. 12/042,821 filed Mar. 5, 2008 entitled Fringe Projection System and Method for a Probe Suitable for Phase-Shift Analysis, which is incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The subject matter described herein relates generally to borescopes and endoscopes, and more particularly, to a borescope/endoscope which provides 3D surface mapping and dimensional measurement.
  • 2. Related Art
  • Borescopes and endoscopes are typically used for inspection inside a remote cavity. Most borescopes/endoscopes, referred to herein as a probe, employ an external light source coupled to fiber optic bundles in the probe to provide illumination of a remote object or surface at the distal end. When the object is illuminated, an internal image is formed by a lens system on an image sensor, and the image is relayed to a connected display, such as a video screen. The image sensor may be located at the proximal end of the probe, as with an optical rigid borescope or fiberscope, or at the distal end as with a video borescope or endoscope. Such systems are often used to inspect inaccessible locations for damage or wear or to verify that parts have been properly manufactured or assembled. Among other things, it is desirable to obtain dimensional measurements to verify that damage or wear does not exceed an operational limit or that a manufactured part or assembly meets its specifications. It may also be desirable to produce a 3D model or surface map for comparison to a reference, 3D viewing, reverse engineering, or detailed surface analysis.
  • Phase-shift technology is well suited to addressing these measurement needs, but its implementation in a borescope or endoscope presents numerous system-level challenges. It is desirable to address these challenges in a manner that yields a reliable and easy-to-use system.
  • BRIEF DESCRIPTION OF THE INVENTION
  • In accordance with an embodiment of the present invention, a probe system comprises an imager and an inspection light source. The probe system is configured to operate in an inspection mode and a measurement mode. During inspection mode, the inspection light source is enabled. During measurement mode, the inspection light source is disabled, and a structured-light pattern is projected. The probe system is further configured to capture at least one measurement mode image. In the at least one measurement mode image, the structured-light pattern is projected onto an object. The probe system is configured to utilize pixel values from the at least one measurement mode image to determine at least one geometric dimension of the object.
  • In another embodiment of the invention, a probe system comprises an imager, and the probe system is configured to operate in an inspection mode and a measurement mode. Diffuse illumination light is projected during inspection mode, and a structured-light pattern is projected during measurement mode. The probe system is further configured to capture at least one measurement mode image. In the at least one measurement mode image, the structured-light pattern is projected onto an object. The probe system is configured to utilize pixel values from the at least one measurement mode image to determine at least one geometric dimension of the object. The probe system is also configured to detect relative movement between a probe and the object between captures of two or more of a plurality of images.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The following detailed description is made with reference to the accompanying drawings, in which:
  • FIG. 1 is a schematic diagram of a borescope/endoscope system in accordance with an embodiment of the present invention.
  • FIG. 2 is a graph showing the trajectory of an exemplary projection set projected from one side of the FOV.
  • FIG. 3 is a graph showing the trajectory of the structured-lines of one fringe set in each of a first and second exemplary projection set relative to a field of view.
  • FIG. 4 is a flow chart illustrating an exemplary embodiment of the steps involved in motion detection.
  • FIG. 5 is a flow chart illustrating an exemplary alternative embodiment of the steps involved in motion detection.
  • FIG. 6 is a flow chart illustrating an exemplary embodiment of the steps involved during an image capture sequence of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • FIG. 1 illustrates a borescope/endoscope system, or probe system 10, according to an embodiment of the invention. An insertion tube 40 comprises elongated portion 46 and detachable distal tip 42. Elongated portion 46 comprises a long, flexible main portion, a bending neck, and a camera head. Delineation line 41 shows where the camera head starts on elongated portion 46. The camera head of elongated portion 46 typically includes at least imager 12, electronics 13, and probe optics 15. Detachable distal tip 42 typically attaches to the camera head of elongated portion 46. Detachable distal tip 42 contains viewing optics 44, which are used in combination with probe optics 15 to guide and focus light received from the viewed surface or object (not shown) onto imager 12.
  • The elements shown in tip 42 could alternatively be located on elongated portion 46. These elements include viewing optics 44, at least one emitter module 37, at least one intensity-modulating element 38, and light passing element 43. In addition, the at least one light emitter module 37, comprising a plurality of light emitters, could be fixedly attached to insertion tube 40 while the at least one intensity-modulating element is disposed on detachable tip 42. In this case, precise and repeatable alignment between detachable tip 42 and elongated portion 46 is required, but the arrangement is advantageous because it allows different fields of view while eliminating the need for contacts between elongated portion 46 and detachable tip 42.
  • Shown in FIG. 1, imager 12 is located at the distal end of insertion tube 40. Alternatively, imager 12 may be located at the proximal end of insertion tube 40. The alternative configuration may be suitable, for example, in a rigid borescope or fiberscope.
  • Imager 12 obtains at least one image of the viewed surface. Imager 12 may comprise, for example, a two-dimensional array of light-sensitive pixels that outputs a video signal in response to the light level sensed at each pixel. Imager 12 may comprise a charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS) image sensor, or other devices of similar function. The video signal is buffered by electronics 13 and transferred to imager interface electronics 31 via signal line 14. Imager interface electronics 31 may include, for example, power supplies, a timing generator for generating imager clock signals, an analog front end for digitizing the imager video output signal, and a digital signal processor (DSP) 51 for processing the digitized imager video data into a more useful format for video processor 50.
  • Video processor 50 performs various functions including, but not limited to, image capture, image enhancement, graphical overlay merging, and video format conversion, and stores information relating to those functions in video memory 52. Video processor 50 may comprise a field-programmable gate array (FPGA), a camera DSP, or other processing elements and provides information to and receives information from central processing unit (CPU) 56. The provided and received information may relate to commands, status information, video, still images, and/or graphical overlays. Video processor 50 also outputs signals to various monitors such as computer monitor 22, video monitor 20, and integral display 21. Video processor 50 also comprises motion detection module 53 and/or fringe contrast determining function 54. Alternatively, CPU 56 or microcontroller 30, described below, or probe electronics 48 comprising camera control electronics (not shown), may include motion detection module 53.
  • When connected, each of computer monitor 22, video monitor 20, and/or integral display 21 typically displays images of the object or surface under inspection, menus, cursors, and measurement results. Computer monitor 22 is typically an external computer type monitor. Similarly, video monitor 20 is typically an external video monitor. Integral display 21 is integrated and built into probe system 10 and typically comprises a liquid crystal display (LCD).
  • CPU 56 preferably uses both program memory 58 and non-volatile memory 60, which may include removable storage devices. CPU 56 may also use volatile memory such as RAM for program execution and temporary storage. A keypad 64 and joystick 62 convey user input to CPU 56 for such functions as menu selection, cursor movement, slider adjustment, and articulation control. Computer I/O interface 66 provides various computer interfaces to CPU 56 such as USB, Firewire, Ethernet, audio I/O, and wireless transceivers. Additional user I/O devices such as a keyboard or mouse may be connected to computer I/O interface 66 to provide user control. CPU 56 generates graphical overlay data for display, provides recall functions and system control, performs phase-shift analysis and measurement processing, and provides image, video, and audio storage. CPU 56 and the previously discussed video processor 50 may be combined into one element of probe system 10. In addition, components of probe system 10 including, but not limited to, CPU 56 and video processor 50 may be integrated and built into probe system 10 or, alternatively, be externally located.
  • Referring to the at least one emitter module 37, light from the at least one emitter module 37 projects at least one structured-light pattern on the surface suitable for phase-shift analysis. The structured-light pattern preferably comprises parallel light and dark lines comprising sinusoidal intensity profiles. Line patterns having square, trapezoidal, triangular, or other profiles may be projected on the surface as well when used with appropriate phase-shift analysis to determine phase of the pattern. The pattern may also comprise other than straight, parallel lines. For example, curved lines, wavy lines, zigzagging lines, or other such patterns may be used with appropriate analysis.
  • The structured-light pattern projected from the at least one emitter module 37 may be created a number of ways. Emitter module 37 may comprise at least one light emitting element formed to include appropriate parallel light and dark lines. Light from the light emitting element may be passed through intensity modulating element 38. Alternatively, emitter module 37 may comprise a plurality of light emitters. The plurality of light emitters may be strategically positioned to form a structured-light pattern on the surface and/or light from the plurality of light emitters may be passed through intensity modulating element 38. In an embodiment of the present invention, intensity modulating element 38 comprises a line grating, which creates a structured-light pattern when light from emitter module 37 passes through to the surface or object (not shown).
  • A plurality of fringe sets are projected from the probe onto the viewed surface or object. A fringe set comprises at least one structured-light pattern. The structured-light pattern of one fringe set exhibits a spatial or phase-shift relative to the structured-light patterns of other fringe sets. The structured-light pattern preferably comprises parallel light and dark lines comprising sinusoidal intensity profiles. Line patterns having square, trapezoidal, triangular, or other profiles may be projected on the surface as well when used with appropriate phase-shift analysis to determine phase of the pattern. The pattern may also comprise other than straight, parallel lines. For example, curved lines, wavy lines, zigzagging lines, or other such patterns may be used with appropriate analysis.
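  • As a rough illustration of such a fringe set, the following sketch generates three patterns of parallel vertical lines with sinusoidal intensity profiles, mutually phase-shifted by 2π/3; the resolution, line period, and step count are illustrative assumptions rather than values from this disclosure.

```python
import numpy as np

def make_fringe_set(width=640, height=480, period_px=32, steps=3):
    """Generate `steps` patterns of parallel vertical lines with sinusoidal
    intensity profiles, each phase-shifted by 2*pi/steps relative to the
    previous one (all parameters are illustrative assumptions)."""
    x = np.arange(width)
    patterns = []
    for k in range(steps):
        shift = 2.0 * np.pi * k / steps
        row = 0.5 + 0.5 * np.cos(2.0 * np.pi * x / period_px + shift)
        patterns.append(np.tile(row, (height, 1)))  # (height, width), values in [0, 1]
    return patterns
```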
  • When emitter module 37 comprises a plurality of light emitters, a fringe set comprises a structured-light pattern projected when one emitter group comprising a group of at least one light emitter is emitting light. In other words, a different subset of the plurality of light emitters emits light to project each of a plurality of structured-light patterns. The plurality of light emitters of emitter module 37 are positioned such that the structured-light pattern projected when one emitter group is emitting light exhibits a spatial or phase-shift relative to the structured-light patterns projected when other emitter groups are emitting light.
  • Light from the plurality of light emitters disposed on detachable tip 42 is passed through at least one intensity modulating element 38 to alter the distribution of light and project at least one structured-light pattern on the viewed surface suitable for phase-shift analysis. In one embodiment, the plurality of light emitters comprising an emitter group are spaced apart along the axis perpendicular to the lines on the line grating by a distance equal to an integer number of periods of the line grating. As a result, when the plurality of light emitters comprising one emitter group are simultaneously emitting light, the structured-light patterns produced by each of the multiple emitters sum together. This forms a brighter line pattern than would be generated by a single emitter element.
  • In another embodiment, the plurality of light emitters in an emitter group are arranged in a line parallel to the lines on the line grating and are electrically connected in series. This approach reduces the current needed to achieve a given light output relative to the current that would be required with a single emitter. This is beneficial as the emitter power is generally supplied through small wires having significant resistance, and reducing the drive current reduces the power dissipated in the wires and supplied by the emitter drive circuit.
  • A plurality of light emitting diodes (LEDs) may comprise the plurality of light emitters of the at least one emitter module 37. LEDs are practical in probe system 10 at least because LEDs offer consistent, uniform illumination, no speckling, and fast switching between fringe sets. However, any light emitting source(s) offering the qualities mentioned above are sufficient for use in probe system 10. Other such light sources include, but are not limited to, organic LEDs, plasma elements, fiber coupled lasers, and laser arrays.
  • The at least one emitter module 37 on detachable tip 42 may further comprise electronics for control/sequencing of emitters, sensing temperature, and storage/retrieval of calibration data. The at least one emitter module 37 may include a heat sink made of a ceramic or metal, for example, to reduce the temperature rise of the plurality of light emitters.
  • System 10 further comprises contacts 36 that electrically couple elongated portion 46 to detachable tip 42 through the camera head. Contacts 36 may be spring loaded and also provide electrical power from drive conductor 35 to emitter module 37. In an embodiment of the invention, drive conductor 35 carries power from emitter drive 32 to the plurality of light emitters disposed on the distal end of insertion tube 40. Drive conductor 35 comprises one or more wires and may be incorporated with signal line 14 in a common outer jacket (not shown). Drive conductor 35 may also share conductors with signal line 14 and/or utilize the insertion tube 40 structure for carrying current. Emitter drive 32 includes, for example, an adjustable current source with a variable on time to compensate for light emitters with differing power capabilities and efficiencies.
  • Discussed above, video processor 50 or CPU 56 comprises a brightness or fringe contrast determining function 54 to determine whether one emitter or multiple emitters should be enabled for each emitter group. In an embodiment of the present invention, brightness determining function 54 communicates with emitter drive 32 to selectively transmit current through specific wires connected to emitter module 37 to light an appropriate number of emitters per emitter group. Further control over brightness can be achieved by varying the drive level applied to the emitters or the duration of time the emitters are driven.
  • When brightness determining function 54 is located separately from emitter drive 32, one drive wire of drive conductor 35 connects emitter drive 32 to emitter module 37, and one or more control wires (not shown) controlled by brightness determining function 54 are also connected to emitter module 37. A circuit (not shown) included on emitter module 37 can selectively connect one or multiple emitters to the drive wire in response to signals on the control wire. Alternatively, when emitter drive 32 comprises brightness determining function 54, drive conductor 35 comprises one or more drive wires (not shown) per emitter. In this case, brightness determining function 54 selectively transmits current through specific drive wires of drive conductor 35 to light an appropriate number of emitters per emitter group.
  • In an embodiment of the invention, at least one calibrating-light pattern is projected onto the viewed surface or object. Projecting light from at least one of the plurality of light emitters may be used to create the at least one calibrating-light pattern on the surface or object. The calibrating-light pattern may comprise at least one structured-light pattern, and passing light from at least one of the plurality of light emitters through intensity modulating element 38 may create at least one calibrating-light pattern on the object. The calibrating-light pattern may include, but is not limited to, angled lines, a single line, a plurality of lines, a dot, a plurality of dots, and a plurality of parallel light and dark lines. It can be appreciated that fringe sets and calibrating-light patterns may be projected from the same emitter module 37. This may be accomplished, for example, by spacing apart fringe set emitters and calibrating pattern emitters and passing light from them through separate areas of intensity modulating element 38.
  • In another embodiment of the present invention, a first projection set and a second projection set are projected onto a surface. A projection set comprises at least one fringe set comprising a structured-light pattern. When a projection set comprises a plurality of fringe sets, the structured-light pattern of one fringe set of the first projection set exhibits a phase-shift relative to the structured-light patterns of the other fringe sets of the first projection set. Similarly, the structured-light pattern of one fringe set of the second projection set exhibits a phase-shift relative to the structured-light patterns of other fringe sets of the second projection set. Typically, the first projection set is projected from one side of viewing optics 44 and the second projection set is projected from the other side of viewing optics 44. Depending on the configuration of detachable tip 42, a first projection set may alternatively be projected from the top of viewing optics 44 and a second projection set may be projected from the bottom of viewing optics 44, or vice versa. Even if insertion tube 40 is rotated, the first and second projection sets are projected from opposite positions or angles relative to the FOV. Therefore, the first projection set may be projected from any position or angle around viewing optics 44 that is opposite that of the second projection set.
  • Fringe sets 0, 1, and 2 of FIG. 2 comprise an exemplary projection set. In the case of FIG. 2, a plurality of fringe sets comprise the projection set. When a projection set comprises a plurality of fringe sets, the plurality of fringe sets comprising the projection set are typically projected from approximately the same origin relative to the FOV.
  • To further illustrate this, FIG. 3 shows a graph of two fringe sets, projected from opposite sides of the FOV. The fringe set in FIG. 3 represented by the solid lines projected from one side of the FOV comprises a first projection set, while the fringe set in FIG. 3 represented by the dashed lines projected from the other side of the FOV comprises a second projection set. Regarding the exemplary case of FIG. 3, only one fringe set per projection set is shown; however, a plurality of fringe sets may comprise each projection set. In an embodiment of the invention, a projection set comprises a plurality of fringe sets, each fringe set comprising a structured-light pattern, wherein the light pattern of one fringe set exhibits a phase-shift relative to the light patterns of the other fringe sets. When a first projection set and a second projection set are projected, a first image set and a second image set are captured. The first image set comprises fringe set images of the first projection set, and the second image set comprises fringe set images of the second projection set, where one fringe set is projected onto the surface or object per image.
  • The probe operates in measurement mode when the at least one structured-light pattern is projected onto the surface. In an embodiment of the invention, emitter module 37 is enabled to project at least one structured-light pattern on the surface during measurement mode. During measurement mode, CPU 56 or video processor 50 captures a plurality of measurement mode images wherein the at least one structured-light pattern is projected onto the object. The measurement mode images may comprise fringe sets where no more than one fringe set is projected onto the object per measurement mode image. Measurement mode images of that sort are also referred to herein as fringe set images. Phase-shift analysis may then be performed directly on the plurality of fringe set images.
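  • The disclosure does not fix a particular phase-shift algorithm; as one hedged example, the classic three-step recovery of wrapped phase from three fringe set images (assumed phase shifts of −2π/3, 0, and +2π/3) could look like this:

```python
import numpy as np

def wrapped_phase(i0, i1, i2):
    """Three-step phase-shift analysis for fringe set images captured with
    phase shifts of -2*pi/3, 0, and +2*pi/3 (one common choice; an
    assumption here). Returns the wrapped phase in (-pi, pi] per pixel."""
    i0, i1, i2 = (np.asarray(i, dtype=np.float64) for i in (i0, i1, i2))
    # tan(phi) = sqrt(3)*(I0 - I2) / (2*I1 - I0 - I2)
    return np.arctan2(np.sqrt(3.0) * (i0 - i2), 2.0 * i1 - i0 - i2)
```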
  • The probe operates in inspection mode when inspection light source 23 is enabled. Light is projected from inspection light source 23 onto a surface or object. During inspection mode, the at least one structured-light pattern may be absent. Generally, at least one image, referred to herein as an inspection mode image, is captured when light is projected from inspection light source 23 onto the viewed surface or object. Inspection light source 23 outputs relatively uniform light or diffuse illumination light from the distal end of insertion tube 40. The elements that produce and deliver light during inspection mode may collectively be referred to as an inspection light delivery system. In one embodiment, the inspection light delivery system comprises inspection light source 23, source fiber bundle 24, shutter mechanism 34, probe fiber bundle 25, and light passing element 43. In other embodiments, the inspection light delivery system may comprise very different elements such as, in the case of distally-located white LEDs, an LED drive circuit that can be disabled or provides an adjustable output current, wires for delivering power to the LEDs, the LEDs themselves, and a protective element to protect the LEDs. In another embodiment, the inspection light delivery system comprises a proximal LED coupled to a fiber bundle, which delivers light to the distal end of insertion tube 40, and an LED drive circuit.
  • Referring again to measurement mode, the intensity of the light output from the inspection light delivery system, originating from light source 23, is automatically decreased, or the output is disabled, during measurement mode to avoid reducing the contrast of the at least one projected structured-light pattern. For example, CPU 56 may be configured to give an original command to turn off light source 23 electronically, prior to projecting the at least one structured-light pattern, through an enable/disable input to light source 23. Inspection light source 23 may then be automatically enabled, for example, electronically after a plurality of measurement mode images are captured or upon exiting measurement mode.
  • Similarly, CPU 56 may also be configured to give an original command to turn on or off light from the inspection light delivery system through the use of shutter mechanism 34. Shutter mechanism 34 is configured to allow light output from the inspection light delivery system during inspection mode or regular inspection and to block or otherwise inhibit light output originating from inspection light source 23 during measurement mode. Shutter mechanism 34 includes, for example, a solenoid or motor driven mechanical shutter or an electric light source disabler. When shutter mechanism 34 allows light from inspection light source 23 to pass, shutter mechanism 34 is in an open position. When shutter mechanism 34 blocks light from inspection light source 23, shutter mechanism 34 is in a closed position. During inspection mode, shutter mechanism 34 is configured to be in an open position. In contrast, during fringe set projection, shutter mechanism 34 is configured to be in a closed position. The location of shutter mechanism 34 can vary based on its implementation. In an embodiment of the invention, when shutter mechanism 34 allows light to pass, probe fiber bundle 25 delivers light to the surface or inspection site via light passing element 43.
  • Inspection light source 23 is typically a white light source, but may comprise any appropriate light source for a probe such as a mercury or metal halide arc lamp, halogen lamp, laser/phosphor system, or LED based light source which could be either proximally or distally located. When a fiber based light source is used, source fiber bundle 24 may be included in system 10. Source fiber bundle 24 comprises a non-coherent or semi-coherent fiber optic bundle and transmits light to shutter mechanism 34. Alternatively, source fiber bundle 24 may be omitted, and shutter mechanism 34 may be located directly between inspection light source 23 and probe fiber bundle 25. Probe fiber bundle 25 comprises a non-coherent fiber optic bundle. Light passing element 43 comprises a glass cane, formed fibers, and/or distribution control features such as lenses or a diffuser.
  • In some cases, projected light patterns in captured measurement mode images can be distracting and can make it more difficult for operators to see details on the viewed object. It is thus desirable to allow the operator to view a normal inspection mode image while placing measurement cursors rather than a measurement mode image that includes one or more structured-light patterns. Preferably, at least one counterpart inspection mode image and at least one counterpart measurement mode image are captured. The at least one counterpart measurement mode image comprises at least one of the plurality of measurement mode images, and the at least one counterpart inspection mode image comprises at least one inspection mode image captured in close time proximity to the at least one counterpart measurement mode image. Inspection mode images and measurement mode images captured in close time proximity are referred to herein as counterpart images. Ideally, counterpart images comprise images of an object in the same position relative to the FOV.
  • Capturing counterpart images in close time proximity is advantageous at least because the relative movement between the probe's distal tip and the viewed object between the captures of the counterpart images is minimized. Geometrical features, such as defects and edges, will appear in the same position in the counterpart images so that the locations of cursors positioned on an inspection mode image correspond to the same points on the viewed object in the measurement mode images. In an embodiment of the invention, motion detection module 53 analyzes inspection mode and measurement mode counterpart images.
  • Motion detection module 53 may be configured to analyze the images once all of the images have been captured. Alternatively, motion detection module 53 may be configured to analyze the images sequentially after the capture of each image. Motion detection module 53 is configured to automatically detect probe and/or surface movement between measurement mode images, also referred to herein as fringe set images or the images captured comprising structured-light patterns. Motion detection module 53 may be configured to compare only inspection mode images or only measurement mode images. Furthermore, motion detection module 53 may optionally be configured to compare at least one measurement mode image with its counterpart inspection mode image(s). In an embodiment of the invention, counterpart inspection mode image(s) may be captured at the beginning and/or the end of its counterpart measurement mode capture sequence.
  • Motion detection module 53 could be further configured to compare one or more captured images from each of two or more successive measurement mode capture sequences such that images having the same illumination and/or structured light patterns present may be compared rather than attempting to compensate for differences in pattern position or illumination. The term “measurement mode capture sequence” used herein is defined as the capture of a plurality of structured-light images, each captured image comprising one projected fringe set. During a measurement mode capture sequence a plurality of measurement mode images are captured.
  • Probe system 10 is configured to detect relative movement between the probe and the surface or object between the captures of two or more of a plurality of images. In an embodiment of the invention, motion detection module 53 is configured to analyze the images captured and compute a motion metric indicative of relative movement between the probe's distal tip and the surface or object between the captures of two or more of a plurality of images. These images may comprise the first and the last of a plurality of images. The first of the plurality of images is either an inspection mode image or a measurement mode image. Similarly, the last of the plurality of images is either an inspection mode image or a measurement mode image. If the motion metric indicates a high probability of movement, the capture of the plurality of images is repeated until either the motion metric indicates a low probability of movement or a pre-determined timeout occurs. For example, if the motion metric indicates a high probability of movement between counterpart measurement mode and inspection mode images, the capture of the measurement mode and inspection mode image(s) is repeated until the motion metric indicates a low probability of movement or a pre-determined timeout occurs. However, re-capture of the entire plurality of images may not always be necessary.
  • The value of the motion metric can depend upon the implementation of motion detection. The metric could be expressed in pixels of movement; in that case, a limit of one pixel of movement, for example, may indicate a low probability of movement, and any metric representing movement greater than one pixel would indicate a high probability of movement. The metric limit for a low probability of movement could also be experimentally determined. Among others, one method for experimentally determining metric limits includes using a root mean square (RMS) difference between brightness values, as sketched below.
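  • A minimal sketch of such an RMS-based motion metric follows; the threshold is purely illustrative, since the text notes that limits may be experimentally determined.

```python
import numpy as np

def rms_motion_metric(img_a, img_b):
    """Root-mean-square difference between brightness values of two images.
    Larger values suggest a higher probability of relative movement."""
    a = np.asarray(img_a, dtype=np.float64)
    b = np.asarray(img_b, dtype=np.float64)
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Hypothetical, experimentally determined limit (8-bit brightness units).
MOTION_RMS_LIMIT = 4.0

def movement_probable(img_a, img_b, limit=MOTION_RMS_LIMIT):
    """True when the metric indicates a high probability of movement."""
    return rms_motion_metric(img_a, img_b) > limit
```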
  • FIG. 4 is a flow chart illustrating an exemplary embodiment of the steps involved in motion detection. Borescope/endoscope or probe system 10 shown in FIG. 1 is configured to perform the steps indicated in method 400. Method 400 may be implemented when probe system 10 is in inspection mode, and the CPU 56 receives a command requesting measurement. An operator may request measurement by pressing a button (not shown) on the probe system 10 or selecting a menu item from, for example, integral display 21.
  • Once the measurement command is received, at step 402, CPU 56 or video processor 50 captures a first inspection mode image. CPU 56 then sends a command to microcontroller 30 to enter measurement mode. Microcontroller 30 controls emitter drive 32 to perform a measurement mode capture sequence. At step 404, CPU 56 or video processor 50 captures measurement mode images. At least one measurement mode image is captured per structured-light pattern or fringe set. An operator may pre-program the specifics of the measurement mode capture sequence before the implementation of method 400 by selecting a menu item from integral display 21. For example, the operator may desire the capture of a plurality of measurement mode images per structured-light pattern or fringe set. In addition, those images of the same structured-light pattern or fringe set may be captured at the same brightness level or at different brightness levels depending on the analysis and/or mapping desired.
  • After the measurement mode images are captured, emitter drive 32 is disabled by microcontroller 30, and microcontroller 30 configures DSP 51 for inspection mode. At step 406, CPU 56 or video processor 50 captures a second inspection mode image. Motion detection module 53 then analyzes the first and second inspection mode images to determine a motion metric at step 408. If the motion metric indicates an acceptable degree of motion at step 410, the process ends at step 99. If the motion metric indicates an unacceptable degree of motion at step 410 and the pre-set time limit is not reached at step 412, steps 402-412 are repeated until the motion metric indicates an acceptable degree of motion or the pre-set time limit is reached. If the motion metric indicates an unacceptable degree of motion and the pre-set time limit is reached at step 412, the process ends at step 99.
  • In another embodiment of the invention, motion detection module 53 is configured to analyze the images captured based on techniques such as high-frequency detail position comparison. Points in the images that include fast transitions in brightness can be identified in the first image in the sequence, and those points can be checked in one or more subsequent images to determine whether the fast transitions still occur at the same points. This approach can accommodate differences in illumination as would exist between measurement mode and inspection mode images. Images captured under the same lighting conditions, such as inspection-mode images captured before and after the counterpart measurement-mode images, can be simply subtracted from one another to determine whether the image has substantially changed.
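  • A simple sketch of this high-frequency detail comparison, assuming horizontal brightness gradients as the "fast transition" test; the gradient limit and overlap fraction are illustrative assumptions.

```python
import numpy as np

def transition_points(img, grad_limit=40.0):
    """Mark pixels with fast horizontal brightness transitions
    (hypothetical gradient limit in 8-bit brightness units)."""
    g = np.abs(np.diff(np.asarray(img, dtype=np.float64), axis=1))
    return g > grad_limit  # boolean mask of sharp-transition locations

def transitions_match(img_a, img_b, min_overlap=0.8):
    """Check whether the sharp transitions found in img_a still occur at the
    same points in img_b; tolerant of overall illumination differences, as
    between measurement mode and inspection mode images."""
    pa, pb = transition_points(img_a), transition_points(img_b)
    if pa.sum() == 0:
        return True  # no detail to compare
    return (pa & pb).sum() / pa.sum() >= min_overlap
```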
  • FIG. 5 is a flow chart illustrating an exemplary alternative embodiment of the steps involved in motion detection. Borescope/endoscope or probe system 10 shown in FIG. 1 is configured to perform the steps indicated in method 500. Method 500 may be implemented when probe system 10 is in inspection mode, and the CPU 56 receives a command requesting measurement. An operator may request measurement by pressing a button (not shown) on the probe 10 or selecting a menu item from, for example, integral display 21.
  • Once the measurement command is received, at step 502, CPU 56 or video processor 50 captures an inspection mode image. At step 504, the CPU 56 or video processor 50 identifies sharp brightness transition points in the inspection mode image captured at step 502. CPU 56 then sends a command to microcontroller 30 to enter measurement mode. In an embodiment of the invention, microcontroller 30 controls emitter drive 32 to illuminate one emitter group to project a first fringe set. At step 506, CPU 56 or video processor 50 captures the first fringe set images. At least one measurement mode image is captured for the first fringe set. At step 508, the CPU 56 or video processor 50 identifies sharp brightness transition points in at least one of the first fringe set images captured at step 506.
  • Motion detection module 53 then compares the identified sharp brightness transition points of the inspection mode image with those of the fringe set image(s) at step 510. At step 512 motion detection module 53 determines a motion metric based on that comparison. If the motion metric indicates an unacceptable degree of motion at step 514, and the time limit is reached at step 516, the process ends at step 99.
  • If the motion metric indicates an unacceptable degree of motion, and the pre-set time limit is not reached, steps 502-516 are repeated until the motion metric indicates an acceptable degree of motion or the pre-set time limit is reached. Preferably, the sequence is repeated from step 502 to update the inspection mode image because it is unlikely that the measurement mode images will again line up with the original inspection image. Alternatively, the sequence may be repeated from step 506 to compare the captured fringe set images or two or more measurement mode images comprising the same structured-light pattern to each other until they all match up and then capture another inspection mode image at the end.
  • However, if the motion metric indicates an acceptable degree of motion, steps 506-514 are repeated for the second fringe set, then the third fringe set, etc. The process ends after sequencing through steps 506-514 for the last fringe set, once all of the fringe set images are captured for that last fringe set and the motion metric indicates an acceptable degree of motion for the fringe set image(s) in the last fringe set.
  • Referring back to FIG. 1, the previously discussed imager interface electronics 31, emitter drive 32, and shutter mechanism 34 are included in the probe electronics 48. Probe electronics 48 may be physically separated from a main control unit or CPU 56 to provide more local control over probe-related operations. Probe electronics 48 further comprise calibration memory 33. Calibration memory 33 stores information relating to the optical system of detachable tip 42 and/or elongated portion 46 such as magnification data, optical distortion data, and pattern projection geometry data.
  • Calibration memory 33 stores information relating to the intensity relationship between the light projected from light source 23 and the light projected from emitter module 37. The intensity relationship between the light projected by light source 23 and the light projected by emitter module 37 can be pre-determined before any image capture. Typically, the brightness or intensity from light source 23 is greater than the brightness or intensity from emitter module 37. Therefore, the imager 12 exposure time and/or the analog gain applied to the video signal output by imager 12 during inspection mode image capture should be different from those during measurement mode image capture.
  • Microcontroller 30 controls shutter mechanism 34, communicates with CPU 56, controls the emitter drive 32 circuitry, communicates with imager interface electronics 31 to determine and set gain and exposure settings, and stores and reads calibration data in calibration memory 33.
  • Probe system 10 further comprises one or more of a gain function, an exposure function, a gamma correction function and an edge enhancement function applied to image data originating from imager 12. Probe system 10 is configured to automatically adjust the parameters of at least one of said functions when switched between inspection mode image capture and measurement mode.
  • In an embodiment of the invention, the relative intensities of the light output by the inspection light delivery system and the structured-light patterns are determined during a calibration step and stored in calibration memory 33. DSP 51 included in imager interface electronics 31 may be configured to automatically adjust imager 12 exposure and front end analog gain to achieve optimal image brightness for inspection mode image capture.
  • Microcontroller 30 is configured to compute the exposure function and gain function parameters to use during measurement mode from the exposure function and gain function values that are active during inspection mode. Microcontroller 30 is further configured to compute the parameters of the exposure and gain functions according to a pre-determined intensity relationship between the light of the structured-light patterns and the light from inspection light source 23 and to set DSP 51 to apply the adjusted exposure and gain settings to optimize image brightness for measurement mode image capture. For example, the parameters of the exposure and gain functions are adjusted such that the brightness in the plurality of fringe set images is similar to the brightness in the inspection mode image(s), as sketched below. This approach eliminates the time that would be required for DSP 51 to reach an appropriate image brightness after the switch if DSP 51 were left in automatic exposure and gain adjustment mode, which is desirable to minimize the likelihood of motion between image captures. After the measurement mode images are captured, DSP 51 may again be configured for automatic gain and exposure adjustment to optimize image brightness for inspection mode image capture.
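  • One way this computation could look, assuming a calibration-derived intensity ratio between the inspection light and the structured-light patterns; the parameter names, limits, and the spill-over of extra exposure into gain are all illustrative assumptions.

```python
def measurement_mode_settings(insp_exposure_us, insp_gain,
                              intensity_ratio=8.0,
                              max_exposure_us=33000.0, max_gain=8.0):
    """Scale the inspection-mode exposure by a calibration-derived intensity
    ratio (the inspection light is typically brighter than the projected
    fringes) so measurement-mode brightness roughly matches inspection mode.
    If the required exposure exceeds a frame-time limit, trade the remainder
    for analog gain. All constants here are illustrative assumptions."""
    exposure = insp_exposure_us * intensity_ratio
    gain = insp_gain
    if exposure > max_exposure_us:
        gain = min(gain * exposure / max_exposure_us, max_gain)
        exposure = max_exposure_us
    return exposure, gain

# Example: 2 ms at unity gain in inspection mode -> 16 ms at unity gain.
print(measurement_mode_settings(2000.0, 1.0))
```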
  • FIG. 6 is a flow chart illustrating an exemplary embodiment of the steps involved during an image capture sequence of the present invention. The term “image capture sequence” used herein is defined as the capture of counterpart inspection mode and measurement mode images. The term “image capture sequence” is not to be confused with the term “measurement mode capture sequence” defined above.
  • Borescope/endoscope or probe system 10 shown in FIG. 1 is configured to perform the steps indicated in method 600. Method 600 may be implemented when probe system 10 is in inspection mode, and the CPU 56 receives a command requesting measurement. An operator may request measurement by pressing a button (not shown) on the probe 10 or selecting a menu item from, for example, integral display 21.
  • Once the measurement command is received, at step 602, CPU 56 or video processor 50 captures the inspection mode image(s). At step 604, CPU 56 sends a command to microcontroller 30 to enter measurement mode. At step 606, microcontroller 30 reads the analog gain and exposure from DSP 51, and at step 608, the microcontroller 30 adjusts the gain and exposure for measurement mode. Discussed above, the measurement mode DSP 51 settings may be adjusted according to a predetermined intensity relationship between the inspection light delivery system and the structured light patterns. Further at step 608, microcontroller 30 sets DSP 51 to fixed gain and exposure based on the adjusted values. At step 610, the inspection light is disabled by microcontroller 30 or CPU 56 as discussed previously.
  • At step 612, microcontroller 30 controls emitter drive 32 to perform a measurement mode capture sequence. In an embodiment of the invention, performing a measurement mode capture sequence comprises sequencing through emitter groups, different subsets of light emitters, on frame boundaries while possibly adjusting on time or drive level to compensate for different emitter brightness levels. Furthermore, a drive level supplied to one subset of light emitters may be adjusted to compensate for a temperature difference between that subset of light emitters and another subset of light emitters. Different emitter brightness levels may be due to differing emitter efficiencies or to heating of the emitters as the sequence progresses. For example, if the emitters are LEDs, the efficiency generally decreases as temperature increases. When the first LED is turned on, emitter module 37 is cooler than when the last LED is turned on. Thus, the last LED requires more drive current to achieve the same output as the first LED. The difference in drive levels may be predetermined through a calibration step. LED forward voltage drop also typically decreases as temperature increases. Thus, microcontroller 30 in conjunction with emitter drive 32 may measure the LED forward voltage drop to determine LED temperature and more accurately compensate for the efficiency change.
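  • A sketch of this forward-voltage-based compensation, using typical assumed coefficients of roughly −2 mV/°C for LED forward voltage and a fractional efficiency loss per degree; none of these constants come from the disclosure.

```python
def compensated_drive_ma(base_ma, vf_measured, vf_cal,
                         vf_mv_per_c=-2.0, eff_pct_per_c=-0.4):
    """Estimate the LED junction-temperature rise from the shift in forward
    voltage relative to a calibration value (Vf typically falls as the die
    heats; coefficients are illustrative), then raise the drive current to
    offset the estimated efficiency loss."""
    delta_t_c = (vf_measured - vf_cal) * 1000.0 / vf_mv_per_c  # degrees C
    efficiency = 1.0 + (eff_pct_per_c / 100.0) * delta_t_c
    return base_ma / max(efficiency, 0.5)  # clamp to avoid runaway current

# Example: a 40 mV drop in Vf suggests ~20 C of heating -> ~8.7% more current.
print(compensated_drive_ma(100.0, vf_measured=3.16, vf_cal=3.20))
```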
  • At step 614, CPU 56 or video processor 50 captures measurement mode images. At least one measurement mode image is captured per fringe set. In addition, a plurality of measurement mode images may be captured per fringe set such that measurement mode images of each fringe set are captured at the same brightness level; also, a plurality of measurement mode images may be captured per fringe set such that the plurality of fringe set images of at least one fringe set are captured at different brightness levels.
  • Motion detection module 53 analyzes the images for motion at step 616. If motion is detected at step 618, and the pre-set time limit is reached at step 620, the process ends at step 99. If motion is detected, and the pre-set time limit is not reached, steps 612-620 are repeated until motion is not detected or the pre-set time limit is reached. Alternatively, if motion is not detected at step 618, CPU 56 sends a command to microcontroller 30 to enter inspection mode. At step 624, emitter drive 32 is disabled by microcontroller 30. At step 626, microcontroller 30 configures DSP 51 for inspection mode by setting DSP 51 for automatic gain and exposure adjustment. At step 628, CPU 56 or microcontroller 30 enables inspection light output. After step 628, CPU 56 or video processor 50 may again capture inspection mode image(s), as in step 602. This marks the end of the image capture sequence. Method 600 may be repeated automatically to sequence through the steps a pre-determined number of times. Alternatively, an operator may manually command the repetition of method 600 by requesting measurement each time a new image capture sequence is desired.
  • Referring back to step 618 of FIG. 6, probe system 10 does not have to directly enter inspection mode if no motion is detected at step 618. In another embodiment of the invention, if no motion is detected at step 618, the user is given an option to either enter inspection mode at step 622 or to enter a measurement screen (not shown). The measurement screen displays a counterpart inspection mode image, preferably captured at step 602, while analysis or measurement is performed on the at least one counterpart measurement mode image, preferably captured at step 614. The measurement screen enables the placement of measurement cursors on the counterpart inspection mode image while the actual analysis or measurement is performed on data representing the at least one counterpart measurement mode image. Optionally, when entering the measurement screen, the emitter drive is disabled so that structured-light patterns are not projected. The user can choose to enter inspection mode at any point while viewing the measurement screen, resuming the sequence at step 622. If the emitter drive was previously disabled upon entering the measurement screen, step 624 is skipped. The sequence resumes at step 626, where microcontroller 30 configures DSP 51 for inspection mode by setting DSP 51 for automatic gain and exposure adjustment.
  • Probe system 10 is configured to change the parameters of imager 12 analog gain and exposure functions through DSP 51 when switched between inspection mode and measurement mode. Probe system 10 is also configured to automatically adjust other processing parameters of DSP 51, including, but not limited to, gamma correction and edge enhancement, when switched between inspection mode and measurement mode.
  • Regarding gamma correction, typically, imager 12 responds to light in a linear manner. A non-linear re-mapping of the intensity or luminance values is often performed by the DSP 51 to improve the perceived brightness uniformity for image display. Non-linear re-mapping of the intensity values of images captured during inspection mode may be desirable. However, it is preferable to perform phase-shift analysis on images representative of a linear response to light. Therefore, the linear response to light must be carried over from imager 12 to video processor 50 during measurement mode because phase-shift analysis is generally performed on the structured-light images captured during measurement mode. The probe system 10 is configured to decrease an effective level of gamma correction applied to pixels of at least one measurement mode image relative to the level of gamma correction applied during inspection mode. For example, a linear gamma DSP setting is enabled or switched on during measurement mode. During inspection mode, however, the linear gamma DSP setting is typically disabled, or set to be non-linear, to improve the perceived inspection-mode image quality.
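  • A minimal sketch of switching the gamma re-mapping between modes; the display gamma of 2.2 is an illustrative assumption, the point being that measurement mode preserves the imager's linear response for phase-shift analysis.

```python
import numpy as np

def apply_gamma(img01, mode):
    """Re-map luminance for display during inspection mode, or preserve the
    sensor's linear response during measurement mode. `img01` is a float
    image scaled to [0, 1]; the 2.2 display gamma is illustrative."""
    if mode == "measurement":
        return img01  # linear: keep brightness linearity for phase-shift analysis
    return np.power(img01, 1.0 / 2.2)  # non-linear re-mapping for display
```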
  • Regarding edge enhancement, enabling edge enhancement artificially modifies the brightness linearity of an image. This is generally not desirable for images on which phase-shift analysis is performed as images representative of a linear response to light are preferred. Therefore, the edge enhancement function is disabled or switched off for measurement mode image capture, and may be enabled or switched on for inspection mode viewing and inspection mode image capture. Probe system 10 is configured to reduce an effective level of edge enhancement applied to pixels of at least one measurement mode image relative to the level of edge enhancement applied during inspection mode.
  • Further relating to image capture, it is preferable to perform phase-shift analysis on images with little to no random noise to improve measurement accuracy. To reduce random noise, probe system 10 may be configured to capture a plurality of measurement mode images with the same structured-light pattern or same fringe set present and average or sum two or more of those measurement mode images. The result is a plurality of composite images in which only one fringe set is present per composite image. The plurality of measurement mode images with the same structured-light pattern or same fringe set present may be captured at the same or similar brightness levels. When the composite image of a projected fringe set is a result of summing rather than averaging, the dynamic range is increased and noise is reduced. Whether the composite images are a result of summing or averaging, phase-shift analysis and other processes can be performed on the composite images, as sketched below.
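  • A sketch of forming such composite images by averaging or summing repeated captures of the same fringe set:

```python
import numpy as np

def composite(frames, method="average"):
    """Combine repeated captures of the SAME fringe set to reduce random
    noise. Averaging keeps the original brightness range; summing also
    extends dynamic range, so a wider float dtype is used to avoid
    overflowing the original 8-bit range."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return stack.mean(axis=0) if method == "average" else stack.sum(axis=0)
```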
  • Furthermore, measurement mode images comprising the same structured-light pattern or same projected fringe set may be captured with different brightness or intensity levels. This is accomplished by either manually or automatically changing the emitter output intensity or duration, imager 12 exposure, analog gain, or some combination thereof.
  • Discussed above, fringe contrast determining function 54 is configured to determine whether one emitter or multiple emitters should be enabled for each emitter group. In order to change the light source intensity, for example, fringe contrast determining function 54 may further be configured to sequence between enabling one emitter and multiple emitters per emitter group to project light. Also discussed above, microcontroller 30 communicates with imager interface electronics 31 to determine and set gain and exposure settings. Similarly, microcontroller 30 may configure emitter drive 32 to alter the amount of power delivered to emitter module 37 and thus vary the intensities of the projected fringe sets.
  • A plurality of measurement mode images captured with different brightness or intensity levels comprising the same projected fringe set may be combined to effectively increase the dynamic range of the system. For example, an image may be captured of a shiny metal surface comprising a structured-light pattern. Reflective properties of the shiny metal surface may prevent adequate intensity levels in dark areas. Therefore, multiple images for each fringe set may be captured with different intensity levels so that, in at least some images, the dark areas are properly illuminated. Then, the captured images for each fringe set may be combined resulting in a single image for each fringe set with sufficient intensity levels across a larger portion of the image than could be achieved with a single image per fringe set.
  • CPU 56 or video processor 50 may be configured to analyze an inspection-mode image prior to capturing the measurement-mode images to determine brightness uniformity. The analysis may be performed by generating and evaluating a histogram of pixel luminance values. A histogram having most of the pixels in a middle brightness range would indicate that a single set of measurement-mode images would likely be adequate. A histogram having mostly very bright pixels and very dark pixels may indicate a highly reflective surface that would benefit from the merging of multiple measurement-mode image sequences captured at different brightness levels.
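  • A sketch of this histogram check; the bin boundaries and the mid-range fraction are illustrative assumptions.

```python
import numpy as np

def needs_multi_brightness(gray, low=40, high=215, mid_fraction=0.6):
    """Evaluate a luminance histogram of an 8-bit inspection-mode image.
    If most pixels fall in a middle brightness range, a single set of
    measurement-mode images is likely adequate; mostly very bright plus
    very dark pixels suggests a reflective surface that would benefit from
    merging sequences captured at different brightness levels. Thresholds
    are illustrative assumptions."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = max(hist.sum(), 1)
    mid = hist[low:high].sum() / total
    return mid < mid_fraction  # True -> capture at multiple brightness levels
```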
  • In an embodiment of the invention, multiple measurement mode images that comprise the same projected structured-light pattern or fringe set and are captured with different intensity levels are combined by evaluating each pixel intensity in the multiple images and selecting the set of pixels that best meets a set of criteria, such as maximum modulation, brightness values, or pixel unsaturation. Pixel values from more than one measurement mode image comprising the same structured-light pattern or same fringe set may be utilized to determine at least one geometric dimension of the surface or object.
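  • As a hedged example of one such selection criterion, the following sketch keeps, per pixel, the brightest unsaturated sample across the differently exposed captures of one fringe set; modulation-based selection across a whole fringe set is another criterion the text mentions.

```python
import numpy as np

def merge_exposures(frames, saturation=250):
    """Per-pixel merge of captures of the same fringe set taken at different
    intensity levels: among unsaturated candidates, keep the brightest
    sample; where every sample is saturated, fall back to the darkest one.
    The saturation limit is an illustrative 8-bit assumption."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    ok = stack < saturation                    # unsaturated candidates
    masked = np.where(ok, stack, -1.0)         # exclude saturated samples
    best = masked.max(axis=0)                  # brightest valid sample
    return np.where(best >= 0, best, stack.min(axis=0))
```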
  • The discussion above generally relates to image capture by probe system 10. The discussion below generally relates to the use and storage of those captured images.
  • In an embodiment of the invention, image(s) comprising structured-light patterns captured during measurement mode are hidden from the operator, and are used only for the actual analysis and measurement. For example, CPU 56 or video processor 50 creates an image file comprising data representing a counterpart inspection mode image and creates a hidden record within the image file comprising data representing its at least one counterpart measurement mode image. The counterpart inspection mode image can be displayed while the operator positions overlay cursors for determining geometric measurements of a viewed object while analysis is performed on the counterpart measurement mode image to determine the geometric measurement results. Therefore, the operator may command analysis or measurement while viewing an inspection mode image even though the analysis or measurement is performed on its counterpart measurement mode image or images. This can also be done prior to image storage, for example, as previously discussed in relation to FIG. 6, where the inspection mode image is displayed and the measurement images are processed.
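The disclosure does not specify a file layout for the hidden record; one common way to realize it in a JPEG file, shown here only as an assumed sketch, is to append tagged data after the image data, where standard viewers ignore it but a custom application can find it (the tag and the JSON payload are hypothetical):

```python
import json
import struct

MAGIC = b"MEASREC0"  # hypothetical tag; the actual record format is not disclosed

def append_hidden_record(jpeg_path, record):
    """Append a JSON-serializable record after the JPEG image data, where
    a standard viewer ignores it but a custom application can find it."""
    payload = json.dumps(record).encode("utf-8")
    with open(jpeg_path, "ab") as f:
        f.write(MAGIC + struct.pack(">I", len(payload)) + payload)

def read_hidden_record(jpeg_path):
    """Recover the record, or None when the file carries no hidden data."""
    with open(jpeg_path, "rb") as f:
        data = f.read()
    pos = data.rfind(MAGIC)
    if pos < 0:
        return None
    (length,) = struct.unpack(">I", data[pos + 8:pos + 12])
    return json.loads(data[pos + 12:pos + 12 + length])
```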
  • The operator skill requirement is reduced by allowing the user to place cursors on a normal inspection mode image without having to worry about stereo matching, perpendicularity, shadow position, etc. The operator may use joystick 62 to place cursors on an inspection mode image displayed on integral display 21. Keypad 64 and/or computer I/O interface 66 may be used interchangeably with joystick 62 to place cursors, and computer monitor 22 and/or video monitor 20 may be used interchangeably with integral display 21 to display the inspection mode image.
  • Typically, when a measurement is performed on an image and the image is saved, graphical overlay data such as cursor positions, measurement results, and accuracy indicators is added to the image. This graphical overlay must be removable to enable easy re-measurement at a later time, when the non-graphical data related to the image is required. A method for storing image-specific data from a probe, specifically calibration data, is known from U.S. Pat. No. 7,262,797. It is further desirable to store measurement data, for example, relating to phase-shift analysis. Measurement data includes, but is not limited to, the luminance portion of a structured-light image (luminance data), measurement cursor positions, merged image data, measurement types, results, accuracy indications, calibration data, phase data, and phase-shift analysis data. Phase data may include data representing the phases of the structured-light patterns, for example wrapped phase, unwrapped phase, relative phase, and/or absolute phase. The co-pending application entitled Phase-Shift Analysis System and Method filed on Mar. 5, 2008 as U.S. Ser. No. 12/042,800, which is incorporated herein by reference, discusses the implementation of phase data in phase-shift analysis. Phase-shift analysis data includes, but is not limited to, surface or object distance data (z values for each pixel, where z is the object distance from the probe) and point cloud data (x, y, z values for each pixel). As one skilled in the art should understand, the method disclosed in U.S. Pat. No. 7,262,797 should include the additional step of “writing measurement data to file.”
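As an illustration of the kinds of measurement data enumerated above, a hypothetical record structure (all field names and types are assumptions for illustration, not part of the disclosure):

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

import numpy as np

@dataclass
class MeasurementRecord:
    luminance: np.ndarray                    # luminance portion of a structured-light image
    cursor_positions: List[Tuple[int, int]]  # measurement cursor pixel coordinates
    measurement_type: str                    # e.g., "length" or "depth"
    result: float                            # measurement result
    accuracy_index: float                    # accuracy indication
    calibration: dict                        # stored per-probe calibration data
    wrapped_phase: Optional[np.ndarray] = None    # phase data from phase-shift analysis
    object_distance: Optional[np.ndarray] = None  # z value for each pixel
    point_cloud: Optional[np.ndarray] = None      # (N, 3) x, y, z values
```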
  • Specifically, the system may include a geometric measurement mode in which a counterpart inspection mode image is displayed, an operator positions measurement cursors on the inspection mode image to identify measurement points, and CPU 56 computes and displays measurement results based on 3D data derived through phase-shift analysis performed on the counterpart measurement images utilizing calibration data. When the operator requests an image save, CPU 56 creates an image file. It merges overlay data with the inspection image data and saves the result in the image file such that when the file is opened by a standard image viewer, the inspection mode image and overlay are displayed. CPU 56 also creates hidden records in the image file. In these hidden records, it stores the inspection image data that was overwritten by the overlay data, referred to as overlay-replacement data, and measurement data. Thus, a custom software application can fully recover the original inspection mode image and has all the information needed to replicate the existing measurements and/or perform additional measurements or analysis.
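A short sketch of the destructive overlay with overlay-replacement data, under the assumption that the overlay is described by a boolean pixel mask:

```python
import numpy as np

def burn_in_overlay(image, overlay_mask, overlay_pixels):
    """Destructively draw the overlay while keeping the pixels it replaces.

    overlay_pixels: replacement values for masked pixels (a single color
    or per-pixel values). Returns the displayable merged image plus the
    overlay-replacement data a custom application needs to restore the
    clean inspection image.
    """
    replaced = image[overlay_mask].copy()   # overlay-replacement data
    merged = image.copy()
    merged[overlay_mask] = overlay_pixels   # overlay is now part of the image
    return merged, replaced

def restore_clean_image(merged, overlay_mask, replaced):
    """Undo the destructive overlay using the stored replacement data."""
    clean = merged.copy()
    clean[overlay_mask] = replaced
    return clean
```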
  • CPU 56 or video processor 50 saves measurement data in bitmap and JPEG images captured using probe system 10 or an accompanying personal computer application. This allows images to have “destructive” overlays that are visible in the image using standard image viewing software, but which are removable by a custom application to present a clean image to the viewer or operator. The clean image can be either a measurement mode image or its counterpart inspection mode image. Storing luminance and measurement data in the image also allows the measurements to be repeated on the image using either the probe software or a custom program, such as a PC-based software package.
  • Once images are captured and, optionally, stored by probe system 10, the image data can be used in many ways. For example, pixel values from the measurement mode images can be used to determine at least one geometric dimension of the object or surface. In addition, image data can be used for performing 3D geometric measurements or 3D visualization. The image data can also be exported or converted to a data format usable with 3D modeling software for detailed analysis or reverse engineering.
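For example, per-pixel point cloud data could be exported to a plain ASCII format that common 3D modeling packages import; a minimal stand-in sketch (the actual export format used by the system is not specified here):

```python
import numpy as np

def export_point_cloud_xyz(point_cloud, path):
    """Write per-pixel x, y, z values to an ASCII .xyz file.

    point_cloud: array of shape (H, W, 3); pixels without valid 3D data
                 may be NaN and are skipped.
    """
    points = point_cloud.reshape(-1, 3)
    points = points[~np.isnan(points).any(axis=1)]
    np.savetxt(path, points, fmt="%.4f")
```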
  • The construction and arrangement of systems and methods relating to image capture, as described herein and shown in the appended figures, are illustrative only and are not limited to a probe. Although only a few embodiments of the invention have been described in detail in this disclosure, those skilled in the art who review this disclosure will readily appreciate that many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, orientations, etc.) without materially departing from the novel teachings and advantages of the subject matter recited in the appended claims. Accordingly, all such modifications are intended to be included within the scope of the present invention as defined in the appended claims. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. In the claims, any means-plus-function clause is intended to cover the structures described herein as performing the recited function, including not only structural equivalents but also equivalent structures. Other substitutions, modifications, changes and omissions may be made in the design, operating conditions and arrangement of the preferred and other exemplary embodiments without departing from the spirit of the embodiments of the invention as expressed in the appended claims. Therefore, the technical scope of the present invention encompasses not only the embodiments described above, but also those that fall within the scope of the appended claims.

Claims (42)

1. A probe system comprising an imager and an inspection light source, the probe system configured to:
operate in an inspection mode and a measurement mode, wherein the inspection light source is enabled during inspection mode and disabled during measurement mode, and wherein a structured-light pattern is projected during measurement mode;
capture at least one measurement mode image wherein the structured-light pattern is projected onto an object; and
utilize pixel values from the at least one measurement mode image to determine at least one geometric dimension of the object.
2. The probe system of claim 1, wherein:
the structured-light pattern comprises parallel light and dark lines and the parallel light and dark lines comprise sinusoidal intensity profiles.
3. The probe system of claim 1 wherein the structured-light pattern comprises a fringe set; and wherein no more than one fringe set is captured in each of the at least one measurement mode images.
4. The probe system of claim 3, wherein:
the structured-light pattern of one fringe set exhibits a phase-shift relative to structured-light patterns of other fringe sets.
5. The probe system of claim 1, further configured to project at least one calibrating-light pattern onto the object.
6. The probe system of claim 1, further comprising:
a shutter mechanism configured to disable the inspection light source during measurement mode by blocking light from the inspection light source.
7. The probe system of claim 1, further configured to:
automatically enable the inspection light source upon exiting measurement mode.
8. The probe system of claim 1, further configured to:
create a single image file comprising a viewable image wherein at least one measurement cursor has been destructively overlaid; and
create at least one hidden record wherein the viewable image and at least one measurement cursor are viewable using image viewer software, the at least one hidden record comprising at least one of overlay replacement data, measurement cursor positions, calibration data, structured light image data, phase data, point cloud data, and object distance data such that a custom program can perform geometric measurements on the viewable image.
9. The probe system of claim 1, further configured to:
capture at least one inspection mode image wherein light from the inspection light source is projected onto the object.
10. The probe system of claim 1, further configured to:
capture a counterpart inspection mode image and at least one counterpart measurement mode image;
wherein the at least one counterpart measurement mode image comprises at least one of the at least one measurement mode images; and
wherein the counterpart inspection image comprises at least one inspection mode image captured in close time proximity to the at least one counterpart measurement mode image, the at least one inspection mode image comprising light from the inspection light source projected onto the object.
11. The probe system of claim 10, further configured to:
enter a measurement screen wherein the counterpart inspection mode image is displayed which enables cursors to be placed for measurement to determine the at least one geometric dimension of the object.
12. The probe system of claim 1, further comprising one or more of a gain function, an exposure function, a gamma correction function, and an edge enhancement function applied to image data originating from the imager;
wherein the probe system is configured to automatically adjust the parameters of at least one of said functions when switched between inspection mode and measurement mode.
13. The probe system of claim 12, further configured to:
compute parameters of the exposure function and the gain function to use during measurement mode using exposure function values and gain function values that are active during inspection mode.
14. The probe system of claim 12, further configured to:
compute parameters of the exposure function and the gain function to use during measurement mode using a pre-determined intensity relationship between light of the structured-light pattern and light from the inspection light source.
15. The probe system of claim 12, further configured to:
decrease an effective level of gamma correction applied to pixels of the at least one measurement mode image relative to the level of gamma correction applied during inspection mode.
16. The probe system of claim 12, further configured to:
reduce an effective level of edge enhancement applied to pixels of the at least one measurement mode image relative to the level of edge enhancement applied during inspection mode.
17. The probe system of claim 1, wherein the at least one measurement mode image comprises a plurality of measurement mode images comprising a same structured-light pattern; and
wherein pixel values from more than one image of the plurality of measurement mode images comprising the same structured-light pattern are utilized to determine the at least one geometric dimension of the object.
18. The probe system of claim 17, wherein the plurality of measurement mode images comprising the same structured-light pattern are captured with different brightness levels.
19. The probe system of claim 17, wherein the plurality of measurement mode images comprising the same structured-light pattern are captured with similar brightness levels.
20. The probe system of claim 17, further comprising:
generating at least one composite image comprising two or more of the plurality of measurement mode images comprising the same structured-light pattern.
21. The probe system of claim 1, further comprising:
a plurality of light emitters, wherein a different subset of the plurality of light emitters emits light to project each of a plurality of structured-light patterns, the light emitter output intensity varying with emitter temperature, and a drive level supplied to one subset of the plurality of light emitters adjusting to compensate for a temperature difference between that subset of the plurality of light emitters and another subset of the plurality of light emitters.
22. A probe system comprising an imager, the probe system configured to:
operate in an inspection mode and a measurement mode, wherein diffuse illumination light is projected during inspection mode, and wherein a structured-light pattern is projected during measurement mode;
capture at least one measurement mode image wherein the structured-light pattern is projected onto an object;
utilize pixel values from the at least one measurement mode image to determine at least one geometric dimension of the object; and
detect relative movement between a probe and the object between captures of two or more of a plurality of images.
23. The probe system of claim 22, wherein:
the structured-light pattern comprises parallel light and dark lines and the parallel light and dark lines comprise sinusoidal intensity profiles.
24. The probe system of claim 22, wherein:
the structured-light pattern comprises a fringe set, and wherein no more than one fringe set is captured in each of the at least one measurement mode images.
25. The probe system of claim 24, wherein:
the structured-light pattern of one fringe set exhibits a phase-shift relative to structured-light patterns of other fringe sets.
26. The probe system of claim 22, further configured to project at least one calibrating-light pattern onto the object.
27. The probe system of claim 22, further comprising:
computing a motion metric indicative of the relative movement between the probe and the object between the captures of the two or more of the plurality of images.
28. The probe system of claim 27, further configured to:
repeat the capture of at least one of the plurality of images if the motion metric indicates a high probability of relative movement until either the motion metric indicates a low probability of relative movement or until a pre-determined timeout occurs.
29. The probe system of claim 22, wherein:
the two or more of the plurality of images comprises at least one inspection mode image captured before the capture of the at least one measurement mode image and at least one inspection mode image captured after the capture of the at least one measurement mode image, the at least one inspection mode image comprising diffuse illumination light projected onto the object.
30. The probe system of claim 22, wherein:
the at least one measurement mode image comprises a plurality of measurement mode images comprising a same structured-light pattern; and
wherein the two or more of the plurality of images comprises two or more of the plurality of measurement mode images comprising the same structured-light pattern.
31. The probe system of claim 22, wherein:
the diffuse illumination light is inhibited during the capture of the at least one measurement mode image.
32. The probe system of claim 22, further configured to:
capture a counterpart inspection mode image and at least one counterpart measurement mode image;
wherein the at least one counterpart measurement mode image comprises at least one of the at least one measurement mode images; and
wherein the counterpart inspection image comprises at least one inspection mode image captured in close time proximity to the at least one counterpart measurement mode image, the at least one inspection mode image comprising diffuse illumination light projected onto the object.
33. The probe system of claim 32, further configured to:
enter a measurement screen wherein the counterpart inspection mode image is displayed which enables cursors to be placed for measurement to determine the at least one geometric dimension of the object.
34. The probe system of claim 22, further comprising one or more of a gain function, an exposure function, a gamma correction function, and an edge enhancement function applied to image data originating from the imager;
wherein the probe system is configured to automatically adjust the parameters of at least one of said functions when switched between inspection mode and measurement mode.
35. The probe system of claim 34, further configured to:
compute parameters of the exposure function and the gain function to use during measurement mode using exposure function values and gain function values that are active during inspection mode.
36. The probe system of claim 34, further configured to:
compute parameters of the exposure function and the gain function to use during measurement mode using a pre-determined intensity relationship between light of the structured-light pattern and the diffuse illumination light.
37. The probe system of claim 34, further configured to:
decrease an effective level of gamma correction applied to pixels of the at least one measurement mode image relative to the level of gamma correction applied during inspection mode.
38. The probe system of claim 34, further configured to:
reduce an effective level of edge enhancement applied to pixels of the at least one measurement mode image relative to the level of edge enhancement applied during inspection mode.
39. The probe system of claim 22, wherein the at least one measurement mode image comprises a plurality of measurement mode images comprising a same structured-light pattern; and
wherein pixel values from more than one image of the plurality of measurement mode images comprising the same structured-light pattern are utilized to determine the at least one geometric dimension of the object.
40. The probe system of claim 39, wherein the plurality of measurement mode images comprising the same structured-light pattern are captured with different brightness levels.
41. The probe system of claim 40, wherein the plurality of measurement mode images captured with different brightness levels are combined by choosing image pixels based on one of brightness values and modulation values.
42. The probe system of claim 39, wherein the plurality of measurement mode images comprising the same structured-light pattern are captured with similar brightness levels.
US12/249,513 2008-03-05 2008-10-10 System aspects for a probe system that utilizes structured-light Active 2029-02-12 US8107083B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US12/249,513 US8107083B2 (en) 2008-03-05 2008-10-10 System aspects for a probe system that utilizes structured-light
EP09172042.5A EP2175231B1 (en) 2008-10-10 2009-10-02 System aspects for a probe system that utilizes structured-light
CN2009102065334A CN101726263B (en) 2008-10-10 2009-10-10 Probe system that utilizes structured-light
US13/100,826 US8422030B2 (en) 2008-03-05 2011-05-04 Fringe projection system with intensity modulating by columns of a plurality of grating elements
US13/334,239 US8976363B2 (en) 2008-10-10 2011-12-22 System aspects for a probe system that utilizes structured-light

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/042,821 US7821649B2 (en) 2008-03-05 2008-03-05 Fringe projection system and method for a probe suitable for phase-shift analysis
US12/249,513 US8107083B2 (en) 2008-03-05 2008-10-10 System aspects for a probe system that utilizes structured-light

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/042,821 Continuation-In-Part US7821649B2 (en) 2008-03-05 2008-03-05 Fringe projection system and method for a probe suitable for phase-shift analysis

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US13/100,826 Continuation-In-Part US8422030B2 (en) 2008-03-05 2011-05-04 Fringe projection system with intensity modulating by columns of a plurality of grating elements
US13/334,239 Continuation US8976363B2 (en) 2008-10-10 2011-12-22 System aspects for a probe system that utilizes structured-light

Publications (2)

Publication Number Publication Date
US20090225333A1 true US20090225333A1 (en) 2009-09-10
US8107083B2 US8107083B2 (en) 2012-01-31

Family

ID=41581113

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/249,513 Active 2029-02-12 US8107083B2 (en) 2008-03-05 2008-10-10 System aspects for a probe system that utilizes structured-light
US13/334,239 Active 2030-04-03 US8976363B2 (en) 2008-10-10 2011-12-22 System aspects for a probe system that utilizes structured-light

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/334,239 Active 2030-04-03 US8976363B2 (en) 2008-10-10 2011-12-22 System aspects for a probe system that utilizes structured-light

Country Status (3)

Country Link
US (2) US8107083B2 (en)
EP (1) EP2175231B1 (en)
CN (1) CN101726263B (en)

Cited By (87)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090225320A1 (en) * 2008-03-05 2009-09-10 Clark Alexander Bendall Fringe Projection System and Method for a Probe using a Coherent Fiber Bundle
US20110205552A1 (en) * 2008-03-05 2011-08-25 General Electric Company Fringe projection system for a probe with intensity modulating element suitable for phase-shift analysis
US20110210961A1 (en) * 2010-02-26 2011-09-01 Clark Alexander Bendall Method of determining the profile of a surface of an object
GB2481459A (en) * 2010-06-25 2011-12-28 Fraunhofer Ges Forschung Capturing A Surface Structure Of An Object Surface
US20120019653A1 (en) * 2010-07-21 2012-01-26 Olympus Corporation Inspection apparatus and measurement method
US20120041267A1 (en) * 2010-08-10 2012-02-16 Christopher Benning Endoscopic system for enhanced visualization
FR2965388A1 (en) * 2010-09-29 2012-03-30 Tokendo Method for estimating wear and dimensions of defects of mechanical element i.e. engine of e.g. helicopter, involves introducing digitized model into image processing device to acquire video-endoscopic image of inspected mechanical element
WO2012155237A1 (en) * 2011-05-16 2012-11-22 National Research Council Of Canada High resolution high contrast edge projection
CN102906536A (en) * 2010-05-19 2013-01-30 株式会社尼康 Shape measuring device and shape measuring method
US8411083B2 (en) 2011-04-06 2013-04-02 General Electric Company Method and device for displaying an indication of the quality of the three-dimensional data for a surface of a viewed object
US20130287288A1 (en) * 2012-04-25 2013-10-31 General Electric Company Method and device for determining the offset distance between two surfaces
EP2689708A1 (en) * 2011-04-27 2014-01-29 Olympus Corporation Endoscopic apparatus and measurement method
US8704890B2 (en) * 2010-08-19 2014-04-22 Olympus Corporation Inspection apparatus and measuring method
CN103868471A (en) * 2012-12-12 2014-06-18 佳能株式会社 Three-dimensional shape measuring apparatus and control method thereof
US20140207403A1 (en) * 2013-01-22 2014-07-24 General Electric Company Inspection instrument auto-configuration
US20140275764A1 (en) * 2013-03-13 2014-09-18 John T. SHEN System for obtaining clear endoscope images
US20140293038A1 (en) * 2013-03-28 2014-10-02 General Electric Company Methods and devices for adjusting brightness of a light source
US20140333778A1 (en) * 2013-05-13 2014-11-13 General Electric Company Automated borescope measurement tip accuracy test
US8976249B2 (en) 2011-11-04 2015-03-10 Empire Technology Development Llc IR signal capture for images
US9013469B2 (en) 2011-03-04 2015-04-21 General Electric Company Method and device for displaying a three-dimensional view of the surface of a viewed object
US9179844B2 (en) 2011-11-28 2015-11-10 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US20150348253A1 (en) * 2014-05-30 2015-12-03 General Electric Company Remote visual inspection image capture system and method
US20160025653A1 (en) * 2013-03-15 2016-01-28 Vidtek Associates NV, Inc. Borescope apparatus and a method of using same
EP3012579A1 (en) * 2014-10-21 2016-04-27 Hand Held Products, Inc. System and method for dimensioning
US9412189B2 (en) 2013-05-13 2016-08-09 General Electric Company Method and system for detecting known measurable object features
US9464885B2 (en) 2013-08-30 2016-10-11 Hand Held Products, Inc. System and method for package dimensioning
US9557166B2 (en) 2014-10-21 2017-01-31 Hand Held Products, Inc. Dimensioning system with multipath interference mitigation
US9581802B2 (en) 2011-05-24 2017-02-28 Olympus Corporation Endoscope device, and measurement method
US9600928B2 (en) 2013-12-17 2017-03-21 General Electric Company Method and device for automatically identifying a point of interest on the surface of an anomaly
US9622644B2 (en) 2011-05-24 2017-04-18 Olympus Corporation Endoscope
US20170116462A1 (en) * 2015-10-22 2017-04-27 Canon Kabushiki Kaisha Measurement apparatus and method, program, article manufacturing method, calibration mark member, processing apparatus, and processing system
JP2017083419A (en) * 2015-10-22 2017-05-18 キヤノン株式会社 Measurement device and method, article manufacturing method, calibration mark member, processing device, and processing system
US9703005B2 (en) * 2015-11-30 2017-07-11 Jp3 Measurement, Llc Downhole sensing via swept source lasers
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US9784566B2 (en) 2013-03-13 2017-10-10 Intermec Ip Corp. Systems and methods for enhancing dimensioning
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US9818039B2 (en) 2013-12-17 2017-11-14 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
US9842430B2 (en) 2013-12-17 2017-12-12 General Electric Company Method and device for automatically identifying a point of interest on a viewed object
US9841311B2 (en) 2012-10-16 2017-12-12 Hand Held Products, Inc. Dimensioning system
US20170366773A1 (en) * 2016-06-21 2017-12-21 Siemens Aktiengesellschaft Projection in endoscopic medical imaging
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US9875574B2 (en) 2013-12-17 2018-01-23 General Electric Company Method and device for automatically identifying the deepest point on the surface of an anomaly
US20180045510A1 (en) * 2016-08-12 2018-02-15 General Electric Company Probe System and Method
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9913573B2 (en) 2003-04-01 2018-03-13 Boston Scientific Scimed, Inc. Endoscopic imaging system
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US9984474B2 (en) 2011-03-04 2018-05-29 General Electric Company Method and device for measuring features on or near an object
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
US10010268B2 (en) 2010-09-15 2018-07-03 Olympus Corporation Endoscope apparatus
US10019812B2 (en) 2011-03-04 2018-07-10 General Electric Company Graphic overlay for measuring dimensions of features using a video inspection device
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US10094650B2 (en) 2015-07-16 2018-10-09 Hand Held Products, Inc. Dimensioning and imaging items
EP3274653A4 (en) * 2015-03-22 2018-11-14 Facebook Inc. Depth mapping with a head mounted display using stereo cameras and structured light
US10134120B2 (en) 2014-10-10 2018-11-20 Hand Held Products, Inc. Image-stitching for dimensioning
US10140724B2 (en) 2009-01-12 2018-11-27 Intermec Ip Corporation Semi-automatic dimensioning with imager on a portable device
US10157495B2 (en) 2011-03-04 2018-12-18 General Electric Company Method and device for displaying a two-dimensional image of a viewed object simultaneously with an image depicting the three-dimensional geometry of the viewed object
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US10203402B2 (en) 2013-06-07 2019-02-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US10247547B2 (en) 2015-06-23 2019-04-02 Hand Held Products, Inc. Optical pattern projector
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US10393506B2 (en) 2015-07-15 2019-08-27 Hand Held Products, Inc. Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc. System and method for validating physical-item security
US10586341B2 (en) 2011-03-04 2020-03-10 General Electric Company Method and device for measuring features on or near an object
US20200241045A1 (en) * 2019-01-24 2020-07-30 Rohde & Schwarz Gmbh & Co. Kg Probe, measuring system and method for applying a probe
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
US10777317B2 (en) 2016-05-02 2020-09-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10827970B2 (en) 2005-10-14 2020-11-10 Aranz Healthcare Limited Method of monitoring a surface feature and apparatus therefor
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
US11029762B2 (en) 2015-07-16 2021-06-08 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US20210172732A1 (en) * 2019-12-09 2021-06-10 Industrial Technology Research Institute Projecting apparatus and projecting calibration method
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
WO2021256953A1 (en) * 2020-06-18 2021-12-23 Общество С Ограниченной Ответственностью "Турбоскан" Method and device for monitoring the shape of hard-to-reach components
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
CN116105632A (en) * 2023-04-12 2023-05-12 四川大学 Self-supervision phase unwrapping method and device for structured light three-dimensional imaging
US11903723B2 (en) 2017-04-04 2024-02-20 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8107083B2 (en) * 2008-03-05 2012-01-31 General Electric Company System aspects for a probe system that utilizes structured-light
EP2272417B1 (en) * 2009-07-10 2016-11-09 GE Inspection Technologies, LP Fringe projection system for a probe suitable for phase-shift analysis
CN101957496B (en) * 2009-07-17 2014-12-17 通用电气检查技术有限合伙人公司 System and method for projecting fringes suitable for phase shift analysis by utilizing probe
US8165351B2 (en) * 2010-07-19 2012-04-24 General Electric Company Method of structured light-based measurement
US20120107780A1 (en) * 2010-10-28 2012-05-03 Olympus Corporation Inspection apparatus and inspection method
EP2520217B1 (en) * 2011-05-04 2020-08-12 General Electric Company Fringe projection system for a probe with intensity modulating element suitable for phase-shift analysis, and intensity modulating element
CN102508357A (en) * 2011-11-21 2012-06-20 南京春辉科技实业有限公司 Endoscope capable of electrically controlling bend angle
US8786300B2 (en) * 2012-02-07 2014-07-22 General Electric Company Probe assembly and methods for use in inspecting a component
US20130296712A1 (en) * 2012-05-03 2013-11-07 Covidien Lp Integrated non-contact dimensional metrology tool
CN103323463B (en) * 2012-06-18 2015-10-21 戚景赞 The automatic testing method that a kind of high purity carbon fiberreinforced composite conductor core characterizes and system
US20140142383A1 (en) * 2012-11-22 2014-05-22 Gyrus Acmi, Inc. (D.B.A. Olympus Surgical Technologies America) Endoscope Camera Head Memory
US10166044B1 (en) 2013-05-31 2019-01-01 Freshwater Bay Industries, Llc Apparatus for repositioning the vagina, cervix, uterus and pelvic floor and method to secure same
US11627865B2 (en) 2013-05-31 2023-04-18 Freshwater Bay Industries, Llc Vaginal surgical apparatus
WO2016201321A1 (en) * 2015-06-11 2016-12-15 Mark Richey Vaginal surgical apparatus
US20160278810A1 (en) 2013-05-31 2016-09-29 Mark Edmund Richey Vaginal surgical apparatus
US11154327B2 (en) 2013-05-31 2021-10-26 Freshwater Bay Industries, Llc Vaginal surgical apparatus
CN103822583A (en) * 2014-03-12 2014-05-28 武汉华中天纬光电系统有限公司 Multimode multifunctional electronic endoscopy probe
CN105100682B (en) * 2014-04-30 2018-12-25 通用电气公司 Borescope with navigation feature
JP6706026B2 (en) * 2015-04-01 2020-06-03 オリンパス株式会社 Endoscope system and operating method of endoscope apparatus
WO2016194018A1 (en) 2015-05-29 2016-12-08 オリンパス株式会社 Illumination device and measurement device
US9936151B2 (en) 2015-10-16 2018-04-03 Capsovision Inc Single image sensor for capturing mixed structured-light images and regular images
US10785428B2 (en) 2015-10-16 2020-09-22 Capsovision Inc. Single image sensor for capturing mixed structured-light images and regular images
US10402992B2 (en) 2015-10-16 2019-09-03 Capsovision Inc. Method and apparatus for endoscope with distance measuring for object scaling
CN108055524A (en) * 2017-12-22 2018-05-18 深圳市金立通信设备有限公司 A kind of structure light module, assemble method and terminal
US10817729B2 (en) * 2018-09-26 2020-10-27 Allstate Insurance Company Dynamic driving metric output generation using computer vision methods
US11815677B1 (en) 2019-05-15 2023-11-14 Apple Inc. Display using scanning-based sequential pupil expansion
US11237332B1 (en) 2019-05-15 2022-02-01 Apple Inc. Direct optical coupling of scanning light engines to a waveguide
US11719947B1 (en) 2019-06-30 2023-08-08 Apple Inc. Prism beam expander
US11290694B1 (en) 2020-03-09 2022-03-29 Apple Inc. Image projector with high dynamic range


Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4280008A (en) 1976-12-24 1981-07-21 Basf Aktiengesellschaft Chirally substituted 2-imidazolin-5-ones
JPS59192223A (en) 1983-04-16 1984-10-31 Sumitomo Electric Ind Ltd Stereoscopical image fiber
WO1995004254A1 (en) * 1993-07-29 1995-02-09 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Three-dimensional measurement arrangement for inaccessible cavities
KR19990029064A (en) 1995-07-18 1999-04-15 낸시 엘. 후체슨 Moiré Interference System and Method with Extended Image Depth
US7262797B2 (en) 2001-02-22 2007-08-28 Ge Inspection Technologies Lp Method and system for storing calibration data within image files
JP2005009917A (en) * 2003-06-17 2005-01-13 Mitsutoyo Corp Surface copying measuring instrument, surface copying measuring method, surface copying measuring program, and recording medium
US7652275B2 (en) * 2006-07-28 2010-01-26 Mitutoyo Corporation Non-contact probe control interface
US7508529B2 (en) * 2006-07-31 2009-03-24 Mitutoyo Corporation Multi-range non-contact probe
US7969583B2 (en) 2008-03-05 2011-06-28 General Electric Company System and method to determine an object distance from a reference point to a point on the object surface
US8107083B2 (en) 2008-03-05 2012-01-31 General Electric Company System aspects for a probe system that utilizes structured-light
US7812968B2 (en) 2008-03-05 2010-10-12 Ge Inspection Technologies, Lp Fringe projection system and method for a probe using a coherent fiber bundle
EP2272417B1 (en) 2009-07-10 2016-11-09 GE Inspection Technologies, LP Fringe projection system for a probe suitable for phase-shift analysis

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5135308A (en) * 1990-03-09 1992-08-04 Carl-Zeiss-Stiftung Method and apparatus for non-contact measuring of object surfaces
US5069548A (en) * 1990-08-08 1991-12-03 Industrial Technology Institute Field shift moire system
US5434669A (en) * 1990-10-23 1995-07-18 Olympus Optical Co., Ltd. Measuring interferometric endoscope having a laser radiation source
US5386292A (en) * 1992-05-05 1995-01-31 Kaltenbach & Voight Gmbh & Co. Optical measurement of teeth
US5847832A (en) * 1996-03-15 1998-12-08 Hughes Aircraft Company Moire topographic measurement
US5835212A (en) * 1996-10-18 1998-11-10 Uniphase Telecommunications Products, Inc. Variable chirp optical modulator using single modulation source
US6088105A (en) * 1998-04-04 2000-07-11 Joh. & Ernst Link Gmbh & Co. Kg Measuring unit for determining dimensions of test pieces, preferably of hollow bodies, in particular, of bores of workpieces, and method for measuring such dimensions
US6084712A (en) * 1998-11-03 2000-07-04 Dynamic Measurement And Inspection,Llc Three dimensional imaging using a refractive optic design
US6100984A (en) * 1999-06-11 2000-08-08 Chen; Fang Surface measurement system with a laser light generator
US20030043387A1 (en) * 2000-11-22 2003-03-06 Ssang-Gun Lim Method and apparatus for measuring the three-dimensional shape of an object using a moire equipment
US7170677B1 (en) * 2002-01-25 2007-01-30 Everest Vit Stereo-measurement borescope with 3-D viewing
US20060132790A1 (en) * 2003-02-20 2006-06-22 Applied Science Innovations, Inc. Optical coherence tomography with 3d coherence scanning
US20060282009A1 (en) * 2003-06-13 2006-12-14 Ake Oberg Device for measuring physical properties of the tympanic membrane
US20050046872A1 (en) * 2003-08-28 2005-03-03 General Electric Company Method and system for image processing for structured light profiling of a part
US20050099638A1 (en) * 2003-09-17 2005-05-12 Mark Quadling High speed multiple line three-dimensional digitization
US20080208006A1 (en) * 2004-09-24 2008-08-28 Mina Farr Opto-electronic illumination and vision module for endoscopy
US7369253B2 (en) * 2004-10-13 2008-05-06 Akrometrix, Llc Systems and methods for measuring sample surface flatness of continuously moving samples
US20070109558A1 (en) * 2005-11-15 2007-05-17 Harding Kevin G Optical edge break gage
US7821649B2 (en) * 2008-03-05 2010-10-26 Ge Inspection Technologies, Lp Fringe projection system and method for a probe suitable for phase-shift analysis

Cited By (144)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11324395B2 (en) 2003-04-01 2022-05-10 Boston Scientific Scimed, Inc. Endoscopic imaging system
US9913573B2 (en) 2003-04-01 2018-03-13 Boston Scientific Scimed, Inc. Endoscopic imaging system
US10765307B2 (en) 2003-04-01 2020-09-08 Boston Scientific Scimed, Inc. Endoscopic imaging system
US10827970B2 (en) 2005-10-14 2020-11-10 Aranz Healthcare Limited Method of monitoring a surface feature and apparatus therefor
US10291850B2 (en) 2006-12-20 2019-05-14 General Electric Company Inspection apparatus method and apparatus comprising selective frame output
US8422030B2 (en) 2008-03-05 2013-04-16 General Electric Company Fringe projection system with intensity modulating by columns of a plurality of grating elements
US7812968B2 (en) * 2008-03-05 2010-10-12 Ge Inspection Technologies, Lp Fringe projection system and method for a probe using a coherent fiber bundle
US20110205552A1 (en) * 2008-03-05 2011-08-25 General Electric Company Fringe projection system for a probe with intensity modulating element suitable for phase-shift analysis
US20090225320A1 (en) * 2008-03-05 2009-09-10 Clark Alexander Bendall Fringe Projection System and Method for a Probe using a Coherent Fiber Bundle
US10140724B2 (en) 2009-01-12 2018-11-27 Intermec Ip Corporation Semi-automatic dimensioning with imager on a portable device
US10845184B2 (en) 2009-01-12 2020-11-24 Intermec Ip Corporation Semi-automatic dimensioning with imager on a portable device
US20110210961A1 (en) * 2010-02-26 2011-09-01 Clark Alexander Bendall Method of determining the profile of a surface of an object
US8760447B2 (en) 2010-02-26 2014-06-24 Ge Inspection Technologies, Lp Method of determining the profile of a surface of an object
CN102906536A (en) * 2010-05-19 2013-01-30 株式会社尼康 Shape measuring device and shape measuring method
EP2573510A4 (en) * 2010-05-19 2016-11-16 Nikon Corp Shape measuring device and shape measuring method
GB2481459B (en) * 2010-06-25 2017-05-03 Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E V Capturing a surface structure of an object surface
GB2481459A (en) * 2010-06-25 2011-12-28 Fraunhofer Ges Forschung Capturing A Surface Structure Of An Object Surface
US8681217B2 (en) * 2010-07-21 2014-03-25 Olympus Corporation Inspection apparatus and measurement method
US20120019653A1 (en) * 2010-07-21 2012-01-26 Olympus Corporation Inspection apparatus and measurement method
US9277855B2 (en) * 2010-08-10 2016-03-08 Boston Scientific Scimed, Inc. Endoscopic system for enhanced visualization
US11278194B2 (en) 2010-08-10 2022-03-22 Boston Scientific Scimed, Inc. Endoscopic system for enhanced visualization
US20120041267A1 (en) * 2010-08-10 2012-02-16 Christopher Benning Endoscopic system for enhanced visualization
US8704890B2 (en) * 2010-08-19 2014-04-22 Olympus Corporation Inspection apparatus and measuring method
US10010268B2 (en) 2010-09-15 2018-07-03 Olympus Corporation Endoscope apparatus
FR2965388A1 (en) * 2010-09-29 2012-03-30 Tokendo Method for estimating wear and dimensions of defects of mechanical element i.e. engine of e.g. helicopter, involves introducing digitized model into image processing device to acquire video-endoscopic image of inspected mechanical element
US10157495B2 (en) 2011-03-04 2018-12-18 General Electric Company Method and device for displaying a two-dimensional image of a viewed object simultaneously with an image depicting the three-dimensional geometry of the viewed object
US10019812B2 (en) 2011-03-04 2018-07-10 General Electric Company Graphic overlay for measuring dimensions of features using a video inspection device
US9013469B2 (en) 2011-03-04 2015-04-21 General Electric Company Method and device for displaying a three-dimensional view of the surface of a viewed object
US9984474B2 (en) 2011-03-04 2018-05-29 General Electric Company Method and device for measuring features on or near an object
US10586341B2 (en) 2011-03-04 2020-03-10 General Electric Company Method and device for measuring features on or near an object
US8411083B2 (en) 2011-04-06 2013-04-02 General Electric Company Method and device for displaying an indication of the quality of the three-dimensional data for a surface of a viewed object
EP2689708A1 (en) * 2011-04-27 2014-01-29 Olympus Corporation Endoscopic apparatus and measurement method
US20190274591A1 (en) * 2011-04-27 2019-09-12 Olympus Corporation Endoscope apparatus and measuring method
US10898110B2 (en) * 2011-04-27 2021-01-26 Olympus Corporation Endoscope apparatus and measuring method
US10342459B2 (en) 2011-04-27 2019-07-09 Olympus Corporation Endoscope apparatus and measuring method
EP2689708A4 (en) * 2011-04-27 2014-06-18 Olympus Corp Endoscopic apparatus and measurement method
WO2012155237A1 (en) * 2011-05-16 2012-11-22 National Research Council Of Canada High resolution high contrast edge projection
US8754954B2 (en) 2011-05-16 2014-06-17 National Research Council Of Canada High resolution high contrast edge projection
US9622644B2 (en) 2011-05-24 2017-04-18 Olympus Corporation Endoscope
US9581802B2 (en) 2011-05-24 2017-02-28 Olympus Corporation Endoscope device, and measurement method
US10368721B2 (en) 2011-05-24 2019-08-06 Olympus Corporation Endoscope
US8976249B2 (en) 2011-11-04 2015-03-10 Empire Technology Development Llc IR signal capture for images
US9398288B2 (en) 2011-11-04 2016-07-19 Empire Technology Development Llc IR signal capture for images
US9861285B2 (en) 2011-11-28 2018-01-09 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US11850025B2 (en) 2011-11-28 2023-12-26 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US10874302B2 (en) 2011-11-28 2020-12-29 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US9179844B2 (en) 2011-11-28 2015-11-10 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US20130287288A1 (en) * 2012-04-25 2013-10-31 General Electric Company Method and device for determining the offset distance between two surfaces
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US10467806B2 (en) 2012-05-04 2019-11-05 Intermec Ip Corp. Volume dimensioning systems and methods
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
US10635922B2 (en) 2012-05-15 2020-04-28 Hand Held Products, Inc. Terminals and methods for dimensioning objects
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US10805603B2 (en) 2012-08-20 2020-10-13 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US10908013B2 (en) 2012-10-16 2021-02-02 Hand Held Products, Inc. Dimensioning system
US9841311B2 (en) 2012-10-16 2017-12-12 Hand Held Products, Inc. Dimensioning system
CN103868471A (en) * 2012-12-12 2014-06-18 佳能株式会社 Three-dimensional shape measuring apparatus and control method thereof
US10066934B2 (en) 2012-12-12 2018-09-04 Canon Kabushiki Kaisha Three-dimensional shape measuring apparatus and control method thereof
EP2743636A1 (en) * 2012-12-12 2014-06-18 Canon Kabushiki Kaisha Three-dimensional shape measuring apparatus and control method thereof
US20140207403A1 (en) * 2013-01-22 2014-07-24 General Electric Company Inspection instrument auto-configuration
US20140275764A1 (en) * 2013-03-13 2014-09-18 John T. SHEN System for obtaining clear endoscope images
US9784566B2 (en) 2013-03-13 2017-10-10 Intermec Ip Corp. Systems and methods for enhancing dimensioning
US11013398B2 (en) * 2013-03-13 2021-05-25 Stryker Corporation System for obtaining clear endoscope images
US20160025653A1 (en) * 2013-03-15 2016-01-28 Vidtek Associates NV, Inc. Borescope apparatus and a method of using same
US9692954B2 (en) * 2013-03-28 2017-06-27 General Electric Company Methods and devices for adjusting brightness of a light source
US20140293038A1 (en) * 2013-03-28 2014-10-02 General Electric Company Methods and devices for adjusting brightness of a light source
US20140333778A1 (en) * 2013-05-13 2014-11-13 General Electric Company Automated borescope measurement tip accuracy test
US9412189B2 (en) 2013-05-13 2016-08-09 General Electric Company Method and system for detecting known measurable object features
US9074868B2 (en) * 2013-05-13 2015-07-07 General Electric Company Automated borescope measurement tip accuracy test
US10203402B2 (en) 2013-06-07 2019-02-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US9464885B2 (en) 2013-08-30 2016-10-11 Hand Held Products, Inc. System and method for package dimensioning
US10699149B2 (en) 2013-12-17 2020-06-30 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US9600928B2 (en) 2013-12-17 2017-03-21 General Electric Company Method and device for automatically identifying a point of interest on the surface of an anomaly
US9875574B2 (en) 2013-12-17 2018-01-23 General Electric Company Method and device for automatically identifying the deepest point on the surface of an anomaly
US10217016B2 (en) 2013-12-17 2019-02-26 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US9818039B2 (en) 2013-12-17 2017-11-14 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US9842430B2 (en) 2013-12-17 2017-12-12 General Electric Company Method and device for automatically identifying a point of interest on a viewed object
US9633426B2 (en) * 2014-05-30 2017-04-25 General Electric Company Remote visual inspection image capture system and method
US20150348253A1 (en) * 2014-05-30 2015-12-03 General Electric Company Remote visual inspection image capture system and method
US10240914B2 (en) 2014-08-06 2019-03-26 Hand Held Products, Inc. Dimensioning system with guided alignment
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US10121039B2 (en) 2014-10-10 2018-11-06 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US10402956B2 (en) 2014-10-10 2019-09-03 Hand Held Products, Inc. Image-stitching for dimensioning
US10134120B2 (en) 2014-10-10 2018-11-20 Hand Held Products, Inc. Image-stitching for dimensioning
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US10859375B2 (en) 2014-10-10 2020-12-08 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc System and method for picking validation
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US10218964B2 (en) 2014-10-21 2019-02-26 Hand Held Products, Inc. Dimensioning system with feedback
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
EP3012579A1 (en) * 2014-10-21 2016-04-27 Hand Held Products, Inc. System and method for dimensioning
US9557166B2 (en) 2014-10-21 2017-01-31 Hand Held Products, Inc. Dimensioning system with multipath interference mitigation
US10393508B2 (en) 2014-10-21 2019-08-27 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
EP3274653A4 (en) * 2015-03-22 2018-11-14 Facebook Inc. Depth mapping with a head mounted display using stereo cameras and structured light
US11403887B2 (en) 2015-05-19 2022-08-02 Hand Held Products, Inc. Evaluating image values
US11906280B2 (en) 2015-05-19 2024-02-20 Hand Held Products, Inc. Evaluating image values
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US10593130B2 (en) 2015-05-19 2020-03-17 Hand Held Products, Inc. Evaluating image values
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner
US10247547B2 (en) 2015-06-23 2019-04-02 Hand Held Products, Inc. Optical pattern projector
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
US10612958B2 (en) 2015-07-07 2020-04-07 Hand Held Products, Inc. Mobile dimensioner apparatus to mitigate unfair charging practices in commerce
US10393506B2 (en) 2015-07-15 2019-08-27 Hand Held Products, Inc. Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard
US11353319B2 (en) 2015-07-15 2022-06-07 Hand Held Products, Inc. Method for a mobile dimensioning device to use a dynamic accuracy compatible with NIST standard
US11029762B2 (en) 2015-07-16 2021-06-08 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US10094650B2 (en) 2015-07-16 2018-10-09 Hand Held Products, Inc. Dimensioning and imaging items
US20170116462A1 (en) * 2015-10-22 2017-04-27 Canon Kabushiki Kaisha Measurement apparatus and method, program, article manufacturing method, calibration mark member, processing apparatus, and processing system
JP2017083419A (en) * 2015-10-22 2017-05-18 Canon Kabushiki Kaisha Measurement device and method, article manufacturing method, calibration mark member, processing device, and processing system
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US9703005B2 (en) * 2015-11-30 2017-07-11 Jp3 Measurement, Llc Downhole sensing via swept source lasers
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10747227B2 (en) 2016-01-27 2020-08-18 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10777317B2 (en) 2016-05-02 2020-09-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11923073B2 (en) 2016-05-02 2024-03-05 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11250945B2 (en) 2016-05-02 2022-02-15 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US10872214B2 (en) 2016-06-03 2020-12-22 Hand Held Products, Inc. Wearable metrological apparatus
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10417769B2 (en) 2016-06-15 2019-09-17 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US20170366773A1 (en) * 2016-06-21 2017-12-21 Siemens Aktiengesellschaft Projection in endoscopic medical imaging
US20180045510A1 (en) * 2016-08-12 2018-02-15 General Electric Company Probe System and Method
US11125551B2 (en) * 2016-08-12 2021-09-21 Baker Hughes, a GE Company, LLC Light modulation for inspection probes
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US11903723B2 (en) 2017-04-04 2024-02-20 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc. System and method for validating physical-item security
US11340257B2 (en) * 2019-01-24 2022-05-24 Rohde & Schwarz Gmbh & Co. Kg Probe, measuring system and method for applying a probe
US20200241045A1 (en) * 2019-01-24 2020-07-30 Rohde & Schwarz Gmbh & Co. Kg Probe, measuring system and method for applying a probe
US11789038B2 (en) 2019-01-24 2023-10-17 Rohde & Schwarz Gmbh & Co. Kg Probe, measuring system and method for applying a probe
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
US11549805B2 (en) * 2019-12-09 2023-01-10 Industrial Technology Research Institute Projecting apparatus and projecting calibration method
US20210172732A1 (en) * 2019-12-09 2021-06-10 Industrial Technology Research Institute Projecting apparatus and projecting calibration method
WO2021256953A1 (en) * 2020-06-18 2021-12-23 Limited Liability Company "Turboskan" Method and device for monitoring the shape of hard-to-reach components
CN116105632A (en) * 2023-04-12 2023-05-12 四川大学 Self-supervision phase unwrapping method and device for structured light three-dimensional imaging

Also Published As

Publication number Publication date
CN101726263A (en) 2010-06-09
EP2175231A1 (en) 2010-04-14
EP2175231B1 (en) 2014-06-18
US20120188560A1 (en) 2012-07-26
CN101726263B (en) 2012-07-18
US8976363B2 (en) 2015-03-10
US8107083B2 (en) 2012-01-31

Similar Documents

Publication Publication Date Title
US8976363B2 (en) System aspects for a probe system that utilizes structured-light
US8422030B2 (en) Fringe projection system with intensity modulating by columns of a plurality of grating elements
US7821649B2 (en) Fringe projection system and method for a probe suitable for phase-shift analysis
CN106416225B (en) Remote visual inspection image capture system and method
RU2560996C2 (en) Measuring method based on structured light
US10323933B2 (en) Optical three-dimensional shape measuring device
EP2272417B1 (en) Fringe projection system for a probe suitable for phase-shift analysis
US10415958B2 (en) Measuring device
US20070091183A1 (en) Method and apparatus for adapting the operation of a remote viewing device to correct optical misalignment
CN102450997A (en) Endoscopic device
JP6839089B2 (en) Endoscope device, how to operate the endoscope device, and recording medium
EP2520217B1 (en) Fringe projection system for a probe with intensity modulating element suitable for phase-shift analysis, and intensity modulating element
US11125551B2 (en) Light modulation for inspection probes
JPWO2019203006A1 (en) Endoscope device, endoscope processor device and endoscope image display method
WO2019225691A1 (en) Endoscope image processing device and endoscope system
KR101652927B1 (en) Method for displaying image, image pickup system and endoscope apparatus including the same
JP6991600B2 (en) Image measurement system, image measurement method, image measurement program and recording medium
WO2022113934A1 (en) Surface roughness measuring device, and surface roughness measuring method
JP4871403B2 (en) 3D image scanner
US11743596B1 (en) Adaptive brightness non-uniformity correction in endoscope visualization
JP2009014494A (en) Measuring device
JP5211703B2 (en) projector
JP4758773B2 (en) 3D image scanner
KR20150002990A (en) Portable luminance meter
JPH0888791A (en) Image pickup device

Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BENDALL, CLARK ALEXANDER;HARDING, KEVIN GEORGE;KARPEN, THOMAS;AND OTHERS;REEL/FRAME:022392/0482;SIGNING DATES FROM 20081031 TO 20081203

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BENDALL, CLARK ALEXANDER;HARDING, KEVIN GEORGE;KARPEN, THOMAS;AND OTHERS;SIGNING DATES FROM 20081031 TO 20081203;REEL/FRAME:022392/0482

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

AS Assignment

Owner name: BAKER HUGHES OILFIELD OPERATIONS, LLC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GENERAL ELECTRIC COMPANY;REEL/FRAME:056428/0609

Effective date: 20170703

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12