US5491757A - Field tester gaze tracking using content addressable memories to improve image data analysis speed
- Publication number
- US5491757A (application US08/172,136)
- Authority
- US
- United States
- Prior art keywords
- address
- image
- video image
- video
- content addressable
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0083—Apparatus for testing the eyes; Instruments for examining the eyes provided with means for patient positioning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/024—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for determining the visual field, e.g. perimeter types
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F2009/0035—Devices for immobilising a patient's head with respect to the instrument
Definitions
- This invention relates to a field analyzer for testing the optical perception of the retina of the human eye.
- the disclosed method and apparatus includes a moveable chin cup to maintain eye centration on any test frame utilized during the test and includes techniques for rapidly computing gaze direction, including the use of content addressable memories.
- a field analyzer is a device for surveying the sensitivity of a patient's retina.
- a spot of light, termed a point, is projected onto the surface of the hemispherical projection screen.
- a patient viewing the hemispherical projection screen from the center of the sphere fixates along a line of sight to a fixation light source mounted on the surface of the bowl.
- the point of projection on the hemispherical projection screen controllably changes to positions spaced apart from the fixation light source.
- the point is varied in intensity as the point moves from position to position on the hemispherical projection screen.
- a subjective determination is made by the patient by depressing a response button (FIG. 1A, item 30) if the point is seen.
- This simple concept has two basic optical problems interfacing to the patient.
- the patient must fixate on the center of the hemispheric projection screen. This fixation must be maintained when the point is presented, usually to the side of the patient's fixated line of sight, if the point is to fall on a consistent part of the retina.
- the patient's vision usually must be properly corrected to focus the surface of the hemispherical projection screen onto the retina.
- focus is particularly critical when the sensitivity of the retina is measured at the threshold of the patient's vision perception; if the patient's focus is not correct, targets that should be seen are not detected, giving erroneous results. This is because an unfocused spot of light appears dimmer than a focused one.
- the patient's eyeglasses are almost always unsuitable for providing a focused view of the points on the hemispherical projection screen for at least three reasons.
- the frames of the patient's glasses vary unpredictably in size and shape; their obscuration of vision and lens tilt angle are unknowns.
- the optical prescription within the patient's glasses is almost always deficient for the particular focal distance (usually about 30 centimeters) required for the test.
- the glasses almost always do not correct the patient's vision to the distance from the patient's eye to the surface of the screen.
- the viewing angle of the patient's glasses is usually deficient.
- the glasses of the patient may contain bifocal or variable lenses which change the focal distance of the patient as a function of the point position on the screen. Where the field of vision of a patient is being tested, such glasses give erroneous results.
- vision during a field test is typically corrected by so-called trial lenses which are selected to provide vision corrected to the 30 centimeter focal distance and placed near the eye in a trial lens holder.
- two lenses are usually required, one to correct spherical power and one to correct cylinder (astigmatic) power.
- the correction of the patient's eyesight is accomplished by adding one or two trial lenses to the optical path, directly in front of the patient's eye.
- These usually round lenses are made in a variety of sphere and cylinder powers and are selected by the operator based upon the patient's prescription, corrected to 30 centimeters, the radius of the hemispherical projection screen.
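The 30 centimeter correction described above can be sketched as simple vergence arithmetic. This is an illustrative calculation, not from the patent: it adds the dioptric equivalent of the 30 cm screen distance to the patient's distance sphere and rounds to the standard 0.25 diopter trial lens step, ignoring any accommodation the patient may supply.

```python
# Illustrative sketch, not from the patent: sphere power for a 30 cm
# test distance. Accommodation by younger patients, which would reduce
# the added power, is ignored in this simplification.

def trial_lens_sphere(distance_sphere_diopters):
    """Distance sphere plus the vergence of the 30 cm screen,
    rounded to the nearest quarter diopter trial lens step."""
    near_add = 1.0 / 0.30                 # ~ +3.33 D for a 30 cm screen
    total = distance_sphere_diopters + near_add
    return round(total * 4) / 4           # nearest 0.25 D
```

For example, a patient with a -2.00 D distance sphere would receive approximately a +1.25 D trial lens under this simplification.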
- the standard trial lenses are relatively small in diameter (on the order of 3.5 cm).
- the center of the trial lenses should be placed in the approximate center of the eye to avoid prismatic effects associated with strong lenses. Additionally, the trial lenses should be close to the eye, to prevent the obscuring of the patient's vision by the trial lens holder or lens frame. Most field testing is done within a 30 degree angle from the fixation axis. Closeness is even more important when strong positive lenses are used as they make the viewing angle through the lenses smaller by magnifying the bowl.
- Field analyzers typically use the ambient screen light for illumination of the video field.
- the ambient screen light of most field testers comes from the uniform illumination of the hemispherical projection screen surface, this illumination being provided to give uniform contrast to the projected points. It is also known to illuminate the eye from lights mounted on the trial lens holder using infra-red wavelengths to prevent the patient from detecting the lights.
- Mapping the recognized variably positioned points on the spherical projection screen accurately onto corresponding positions on the retina requires that the eye does not change its angular relationship to the center of the hemispherical projection screen as the test progresses.
- the eye is disposed in the head in such a way that changing gaze direction is easily accomplished, and in fact is the most natural thing to do when an object--such as a dim spot of light--comes into peripheral view. It therefore requires a great amount of concentration on the part of the patient to maintain a constant gaze direction. In short, the test procedure, normally consuming up to 20 minutes for each eye, can be very tiring for the patient.
- Field analyzers are known that illuminate the hemispherical projection screen with an even field of light generated by incandescent lamps which contain some infra-red energy.
- the video camera used is sensitive in the infrared spectrum. For patients with a dark colored iris, this increases the contrast between light reflected from the iris and the dark pupil, since all iris colors reflect about the same amount of light under infrared illumination.
- this illumination system also reflects light from the trial lens surface.
- the hemispherical projection screen partly surrounds the lens.
- the lens is typically not anti-reflection coated. Therefore the lens glows with infrared light captured from the hemispherical projection screen. This glow from the lens reduces the pupil to iris contrast in the video image.
- a field test apparatus and method in which gradual movement of the head supporting chin cup is used to maintain a centered relation between the eye being tested and the trial lens frame holding the required prescription for optimum vision of the patient.
- alternating illumination is provided in the infrared between a central reflex on the cornea and an overall eye illuminating source, which generates a high contrast bright circular iris surrounding a central dark pupil.
- content addressable memories are used to generate pointers to locations of specific brightness values for rapid microprocessor analysis of the video data stored in a conventional random access memory (RAM).
- Both the corneal reflection position--which is a brightly illuminated dot on an otherwise dark background--and the boundary between the iris and the pupil--which is a dark central pupil surrounded by a brightly illuminated iris--are approximately located using content addressable memory. Thereafter, the examination of RAM data, representing the video image in digital form, is restricted to the locations pointed to by the content addressable memories (CAM). For location of the pupil center, special techniques are disclosed for examining the first and second derivatives of the iris to pupil boundary data to establish within sub-pixel limits the location of said boundaries. These methods allow sufficient data analysis speed improvements to enable the time shared use of a microprocessor, where gaze tracking is done when other control functions required to execute the field test are inactive.
- FIGS. 1A and 1B are respective side elevation and front elevation schematics of a field test apparatus incorporating the moving chin rest of this invention within the general operating environment of a field test apparatus;
- FIG. 2 is a block diagram illustrating a typical required time sharing cycle or loop for the gaze tracking function of this invention specifically illustrating the narrow time window available for the automated observation of gaze tracking information;
- FIG. 3 is a representation of two video frames with each frame consisting of two interleaved fields, this diagram being useful for understanding the gathering of the raw data and the time available for the time shared gaze tracking computation;
- FIG. 4 is a video memory block diagram illustrating the interconnection required to form a memory system for sorting a digital representation of the video data in RAM and also in content addressable memories;
- FIG. 5A is a front elevation view at the eye of the patient being field tested illustrating the measurement of an arbitrary pupil chord and normal chord bisector for locating the center of the pupil;
- FIG. 5B is a view of the lens holder illustrating the light sources on the lens holder;
- FIG. 6 is a representation of data within a content addressable memory after the presentation of the video data of a corneal reflection;
- FIG. 7 is a representation of data within a content addressable memory after the presentation of the video data of a dark pupil surrounded by an illuminated iris;
- FIG. 8 is a composite diagram representing the video data from a single horizontal video scan line corresponding to the arbitrary chord of FIG. 5A; the image data is presented together with its first and second derivatives; and,
- FIG. 9 is a composite diagram representing the video data from a single horizontal video scan line corresponding to the arbitrary chord of FIG. 5A, said image containing deliberate artifacts common to image processing; the image data is presented together with its first and second derivatives.
- In FIGS. 1A and 1B, a schematic of the movable trial lens holder 40 of this invention is illustrated.
- a patient P is illustrated observing a hemispherical projection screen S.
- Patient P is here illustrated having left eye E being tested. In this test the patient P has been directed to fixate on the fixation light L at the center of the hemispheric projection screen.
- the chin rest 25 illustrated has two indentations, these indentations including indentation 26 for testing the patient's right eye and indentation 27 for testing the patient's left eye.
- Projector 14 under the control of a computer (not shown) well known and understood in the prior art, projects spot 16 of the light on the surface of the hemispherical projection screen. The patient indicates that the spot 16 of light was seen by depressing response button 30. The response of the patient in pressing the button is recorded and mapped by apparatus well known and understood in the prior art.
- the field test apparatus illustrated is old. It may be purchased from Allergan Humphrey (now Humphrey Instruments, Inc.) of San Leandro, Calif., USA under the designation Field Analyzer Series 600.
- the method of image generation will be reviewed. Thereafter, the methods and embodiments containing the content addressable memory will be set forth. This will respectively set forth the time constraints imposed by time sharing of the processor, set forth the sequential frames and interleaved fields utilized for analysis and data collection, and illustrate arbitrary scans of eye chords with the known method of the computation of the pupil center.
- Trial lens holder 40 is shown in FIGS. 1B and 5B as a semicircular frame. Holder 40 has an active and inactive position. In the active position, trial lens holder 40 imparts optical prescription to the central 30° of vision of the patient. When the trial lens holder 40 is in the inactive position, the holder is moved out of the central position to an extreme position shown in broken lines at 40' where the trial lens holder is not in the field of view of the patient P during testing.
- field testers are used for two types of field test. The most frequently done field test tests the central 30 degrees from the fixation axis. The less frequent test makes measurements of visual sensitivity at viewing angles between 30 and 90 degrees from the fixation axis, to test the sensitivity of the peripheral vision. For this kind of testing, lens holder 40 is moved to the position of lens holder 40' shown in broken lines. Typically in this extended field of vision testing no trial lenses are utilized. Otherwise the points presented to extreme angles on the screen would not pass through the viewing angle of the lens. Some of the points would not be corrected by the lens and some would be obscured by the trial lens frame.
- More normal field testing consists of measuring the central vision sensitivity within a 30 degree angle from the fixation axis. It will hereafter be assumed that this measurement is the measurement of interest unless specifically otherwise stated.
- In FIG. 1B, a mechanical schematic is illustrated setting forth the mechanism for the required movement of chin cup 25.
- the mechanical schematic shows the X motor with the body of said motor connected to the chassis of the field tester.
- the shaft 42 extending from the X motor contains a fine external thread.
- the shaft passes through the X motor, which contains a mechanism such as a ball screw nut rotated by the rotor of the X motor. Since the shaft is prevented from rotating by the mechanism, rotation of the nut causes translation of the shaft; as the X motor rotor rotates, the shaft 42 moves the Y motor horizontally.
- the Y motor is of similar design and is mounted on a horizontally sliding carriage 50 driven by the X motor.
- the Y motor is capable of moving the chin cup vertically via vertical shaft 44.
- the illustrated method of vertical movement is precisely analogous to the horizontal movement produced by the X motor.
- the illustrated mechanism typically uses conventional linear stepper motors. These stepper motors allow the controlling computer system described below, to move the chin cup 25 to any position necessary in a vertical plane in front of the eye to account for the differences in physiognomy among patients.
- movement of the chin cup is incremental, with each movement being less than that required for complete correction of eye centering. This less-than-full motion provides a persuasive movement of the chin cup of which the patient is generally unaware, so that the patient is not distracted from the test.
- the chin cup correction speed is slow, adding a correction step after each point is presented and gaze measurement made, to help maintain the patient's average head (eye) position, rather than moving rapidly, as in a true correction servo.
- the chin cup, and therefore the head and the eye, is moved until the corneal reflex is centered in the video window. Centering is as previously disclosed in Lehmer et al., U.S. Pat. No. 5,220,361, issued Jun. 15, 1993, entitled Gaze Tracking for Field Analyzer. Alternately, another acceptable protocol would be to center the pupil in the video window.
- the amount of chin cup motion for each correcting step is to move in the direction that would place the eye in the center of the lens, but to move only a fraction of the maximum amount. This will result in small changes and not make the total correction in one large movement.
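The fractional correction step can be sketched as a low-gain servo update. The gain and stepper resolution below are hypothetical values chosen for illustration; the patent specifies only that each step moves a fraction of the full correction in the centering direction.

```python
# Hypothetical sketch of the fractional chin-cup correction step.
# GAIN and STEPS_PER_MM are illustrative values, not from the patent.

GAIN = 0.25          # move only a fraction of the full error each cycle
STEPS_PER_MM = 40    # assumed stepper motor resolution

def correction_step(eye_x_mm, eye_y_mm, target_x_mm, target_y_mm):
    """Return (x_steps, y_steps): a motor command moving the chin cup
    a fraction of the distance needed to center the eye, so the total
    correction is spread over many small, unobtrusive movements."""
    dx = (target_x_mm - eye_x_mm) * GAIN
    dy = (target_y_mm - eye_y_mm) * GAIN
    return round(dx * STEPS_PER_MM), round(dy * STEPS_PER_MM)
```

Repeating this step after each point presentation converges the eye toward center without the rapid motion of a true correction servo.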
- an additional object of this invention is to determine the actual gaze direction of the eye under field test. This direction is best measured by using the relative position of the center of the pupillary opening 60 and a corneal reflection 140 produced by an infrared source 130 (or 130'; see FIG. 1A) on the surface of the hemispherical projection screen S, near the center. For example, if the eye E changes gaze direction slightly, the corneal reflection 140 of source 130 will move at a different rate than the pupil 60. This is due to the fact that the cornea is a portion of a sphere, smaller in diameter than the eye, mounted on the eye.
- the eye rotates about its center which is not the center of the spherical cornea.
- gaze direction can be readily derived.
- An advantage of the disclosed method for determination of the fixation of the eye is that, since the absolute eye position with respect to the bowl is known from the chin cup positioning protocol, the part of the measured fixation change due solely to a change in eye position may be calculated and subtracted from the measurement. This independence allows the eye to be moved away from the center of the spherical projection screen S.
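A minimal sketch of the differential measurement described above: because a pure head translation moves the pupil and the corneal reflection together, the vector between them isolates gaze rotation. The calibration constant here is an assumption; a real instrument would derive it from the corneal geometry or a calibration step.

```python
# Illustrative sketch: gaze offset estimated from the vector between
# the pupil center and the corneal reflection. K_DEG_PER_PIXEL is a
# hypothetical calibration constant, not a value from the patent.

K_DEG_PER_PIXEL = 0.3    # assumed: degrees of gaze change per pixel

def gaze_offset(pupil_cx, pupil_cy, reflex_cx, reflex_cy):
    """Horizontal and vertical gaze change in degrees. Head translation
    moves pupil and reflex together, so the difference cancels it;
    gaze rotation moves them at different rates, so it survives."""
    dx = pupil_cx - reflex_cx
    dy = pupil_cy - reflex_cy
    return dx * K_DEG_PER_PIXEL, dy * K_DEG_PER_PIXEL
```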
- In FIG. 5B, a rear elevation of the side of the lens holder 40 exposed to the patient P is illustrated. It includes two eye illuminating light sources 55 and 57. A typical trial lens 51 with frame 52 is shown within lens holder 40.
- Reflections 65 and 67 are created by light sources 55 and 57. More importantly, it will be understood that since sources 55 and 57 are off center with respect to video camera V, a so-called dark pupil illumination of the eye will result. This dark pupil illumination will brightly illuminate the iris while leaving the pupil dark.
- the corneal reflection 140 is generated either by infrared light source 130 or preferably 130'. (See FIGS. 1A and 1B).
- corneal reflection illumination occurs first and the pupil illumination occurs second.
- illumination of the eye in the infrared is supplied only by light sources 130 or 130'.
- illumination of the pupil occurs from light sources 55 and 57 on lens holder 40.
- In FIG. 2, a typical time sharing loop is shown.
- when the processor is not idle, as at 200, it is involved with moving motors for the adjustment of chin cup 25, generating an additional spot 16 (see FIGS. 1A and 1B), or reporting data; otherwise the processor will be in a gaze tracking mode.
- This gaze tracking mode will require that the image with the corneal reflection of light source 130 be compared to the image with a dark pupil (See FIG. 5A).
- the window XX located in the approximate center of the image represents the area covered by the video memory used in gaze tracking.
- the first Frame 1 has two fields (1 and 2) containing video data created when the corneal reflection illumination was present. This bright dot of light will appear on the cornea approximately centered in the video frame clear of lens holder 40 as illustrated at Frame 1, Field 1. It will be noted that the lens holder 40 is indicated in broken lines; it will not appear in the gaze tracking data since it is outside of the window XX.
- Frame 2 also has two fields containing video data created when the pupil illumination was present, the bright iris with a dark pupil of eye E.
- the size of the video memory is reduced for gaze tracking, being active only in the window XX covering a small area inside the trial lens. This excludes the trial lens holder from the video data.
- Another advantage of the small window is that it can be converted to a digital form in about one fourth of the available time of the camera field, allowing the time remaining for computation. Given the fact that there is only one video memory, the computation of the corneal reflex must be done in the time between the end of the window and the beginning of the next field, at which time the corneal reflex data may be over-written by the dark pupil data. It is for this reason that the content addressable memory protocol of this invention has been developed.
- the video RAM (Random Access Memory) used to store an image of the eye for gaze measurement contains a digital image, organized as an array.
- the RAM dimensions are 128 dots by 128 lines (16K).
- the data is stored only in a small window, positioned inside the trial lens. There is a byte for each pixel, where the value of the byte represents the brightness value of the pixel.
- finding the corneal reflection requires the centroid of a bright event to be found, and finding the pupil requires the edges of a dark area to be found on each video line.
- the corneal reflection and the pupil edges are found using two consecutive video frames, one frame with just the reflection from the cornea (see above) and one frame with an illuminated iris and a dark pupil and no central corneal reflections.
- RAMs 500 and 501 can store the video data in a conventional manner.
- the address VID-ADD(12:0) is generated by addressing logic (not shown) which has two modes: a sequential address, changing at high speed, to capture the video image; or an address connected to the microprocessor address system, to read the resultant stored data.
- the address is shown as two sets, the dot address (7 bits for 128 dots) and the line address (6 bits for 64 lines).
- the function of item 504 is to complement the dot address when READ-REVERSE is true, making the dot address appear to reverse and allowing RAM 500 to be read back in reverse, as required to find the left hand edge.
- the data to/from RAM 500 can be selected either from VID-DATA(7:0), the digital representation of the video data, via switch 505, or from the microprocessor data port via switch 506.
- the former is used to save the video image and the latter is used to read the resultant data in the RAM.
- RAM 501 is similar, using switches 507 and 508. In practice, RAM 500 saves the first video field and RAM 501 saves the second field.
- the corneal reflection data must be processed between frames, during time period YY, since the RAMs are filled with pupil data on the next two fields. The time to process the pupil data before the shutter closes is ZZ.
- RAM 502 is used as a CAM when loading video data and as a RAM when the resultant data is to be reviewed by the microprocessor.
- switches 509 and 512 are closed, allowing the five low-order address bits to be controlled by the five high-order video data bits. This causes the CAM to have 32 bins.
- the six high-order address bits are connected to the line address, making an array of video intensity level (the horizontal axis) by video line (the vertical axis).
- Switch 512 places the 7 bit dot address onto the data input such that the CAM cell will contain the dot number valid at the time the cell is addressed and written.
- switches 510 and 511 are closed; the former completes the RAM address structure, and the latter places the RAM data on the microprocessor data port.
- RAM 503 is similar to RAM 502 above, except the video data input is connected directly to the data output from RAM 500, via path 517. This path is used while RAM 500 is being read backwards during the second pupil field, to find the left edge.
- switches 513 and 515 are closed.
- switches 514 and 516 are closed.
- each CAM contains four pages, or sections, selected by the control line CAM-PAGE(1:0). This allows a cleared CAM page to be selected between fields, the first storing the corneal reflection data and the second storing the pupil edge data. In practice only two of the four pages are used.
- a content addressable memory, as used here, is a RAM which has been organized as an array of brightness values versus line number.
- the data in the CAM cell is the dot number in force when the cell was written. This much can be seen in FIGS. 6 and 7.
- each CAM has 32 brightness bins by 64 lines. There are only 64 lines since it is storing data from each video field in an interlaced video environment. The data is from every other line, as compared to the data in video RAM.
- the brightness bin is formed by addressing CAM with only the five highest order brightness data bits, thereby grouping the data into bins of eight, since the low order three bits in the byte are not used.
- Bin zero contains data from zero to seven, bin one from eight to fifteen, etc., up to bin 31 from 248 to 255 inclusive. This much can be seen on the graphic plots of FIGS. 6 and 7.
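The binning and write scheme above can be sketched as follows, assuming 8-bit pixels, a 64-line field, and 128 dots per line as described. The function and variable names are illustrative, not from the patent.

```python
# Sketch of the CAM write: the five high-order brightness bits
# (value >> 3) select one of 32 bins; the cell stores the dot number
# in force when it was written, so left-to-right scanning leaves the
# rightmost occurrence of each brightness bin on each line.

BINS, LINES, DOTS = 32, 64, 128

def capture_field(pixels):
    """pixels[line][dot] (0-255) -> cam[line][bin] holding the dot
    number of the LAST pixel on that line falling into that bin."""
    cam = [[0] * BINS for _ in range(LINES)]   # zero means "never written"
    for line in range(LINES):
        for dot in range(DOTS):
            b = pixels[line][dot] >> 3         # bin 0: 0-7, ... bin 31: 248-255
            cam[line][b] = dot                 # overwritten: right edge survives
    return cam
```

Note that a cell holding zero reads as "never written", so dot 0 cannot be distinguished from an empty cell; the patent likewise treats zero as the cleared initial value.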
- FIGS. 6 and 7 are graphical representations of the pattern recognition tasks for gaze measurement, the finding of the corneal reflection and the finding of the pupil edges. It depicts CAM memory as an array, video Line 0 at the top and video Line 63 at the bottom. The horizontal axis represents the 32 video amplitude bins, where bin 0 has the range of 0 to 7, bin 1 has the range of 8 to 15, etc., until bin 31 has the range of 248 to 255 inclusive.
- White space in the graphical presentation indicates that the CAM cell contains a zero, the initial value in the CAM array.
- a plus sign indicates that the cell contains some non-zero data, the dot number in RAM that was in force when the CAM cell was last written.
- the same CAM cell is written more than once, with the dot number in force at that time. In this way the dot number of the last occurrence of the brightness is saved, pointing to the right edge of the pupil, since the video camera scans from left to right.
- in the case of finding the reflection from the cornea, done in the first frame of camera data, CAM can be quickly searched line by line starting at the highest bin, level 31, and continuing the line by line search at reduced bin levels until non-zero data is found.
- the CAM data is shown in FIG. 6. Since CAM is cleared prior to use, any non-zero data in a CAM cell indicates that the brightness level did occur on that line.
- the first non-zero data in CAM points to the dot address in RAM and the line address in RAM is the same as the line address in CAM, taking into account the fact that the CAM line number is not interlaced. A pointer is thereby formed to the location of the corneal reflection data in RAM.
- the search will take 64 times 16, or 1K tests for non-zero data. Since the reflex LED is adjusted prior to the field test to produce a reflex near maximum brightness, the search is much shorter than the 1K worst case.
- Searching CAM for the first occurrence of non-zero data is a much faster task for the computer compared to evaluating the magnitude of the video data. CAM makes a significant speed increase possible.
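The top-down search can be sketched as below. It scans each bin from level 31 downward, line by line, returning the first non-zero cell as a (line, dot) pointer into the video RAM; a cleared cell reads zero. Names are illustrative.

```python
# Sketch of the top-down CAM search for the corneal reflection:
# brightest bin first, each bin scanned line by line, stopping at the
# first non-zero cell. Far cheaper than comparing every pixel's
# magnitude in the video RAM.

def find_reflection(cam, lines=64, bins=32):
    """Return (line, dot) pointing at the corneal reflection in RAM,
    or None if the CAM page contains no data at all."""
    for b in range(bins - 1, -1, -1):     # level 31 down to 0
        for line in range(lines):
            dot = cam[line][b]
            if dot:                       # zero means "never written"
                return line, dot
    return None
```

Because the reflex LED is adjusted before the test to produce a near-maximum brightness, the search normally terminates within the first few bins.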
- the data stored in a CAM cell is the dot number in RAM (0-127) for the last occurrence of the brightness data in the brightness bin on a specific line. Since the data from the video camera is scanned from left to right, the data in the CAM cell is the dot address of the right edge of the video pattern.
- the CAM is written at the same time as the RAM video data is stored, pointing to the location of the data in RAM.
- CAM can be quickly searched line by line starting at the highest bin, level 31. Since CAM is cleared prior to use, any non-zero data in a CAM cell indicates that the brightness level did occur on that line.
- the first non-zero data is the dot address in RAM and the line address in RAM is the same as the line address in CAM. A pointer is thereby formed to the location of the corneal reflection data in RAM.
- FIG. 6 shows the contents of CAM, initially cleared, when the video amplitude in the field is mostly low in amplitude, except for some lines which contain bright data. This is characteristic of the dark frame generated with only the corneal reflection generator LED on.
- Reading the contents of CAM therefore forms the horizontal (dot) pointer for RAM. In the case shown there was only one pixel with the highest brightness. The other two pixels of less brightness were not found since the search stopped when the first was found.
- the low level data (the eye at low illumination, since the illuminators on the trial lens holder and on the bowl are off) can write into more than one bin per line. This indicates that the image has a brightness range which spans more than one CAM bin.
- FIG. 7 shows the contents of CAM, initially cleared, when the video amplitude in the field is mostly mid range, but also contains some dark data, characteristic of an image of the eye with an illuminated iris, the mid range data, and a dark pupil.
- the lines which contain the dark pupil data can be found by searching bin 1. Starting at Line 0 and searching down, the first line with non-zero datum is a line with dark data present. The location of the right edge of said data in RAM can be found by forming a pointer using the value in CAM.
- the RAM data stored during Frame 2 is read backwards into another CAM forming a similar CAM image. This occurs since reading the RAM data backwards is equivalent to a camera that scans right to left, making the last occurrence of a specific video level occur on the left edge. With both of these CAMs available the left and right edges of the pupil can be found on each line. This allows software to locate the pupil edge data in RAM and proceed to find the zero crossing of the second derivative of the video data.
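The final sub-pixel step can be sketched as below: near the dot pointed to by CAM, form the discrete second difference of the scan line and linearly interpolate its zero crossing. The interpolation formula and search radius are assumptions; the patent states only that the zero crossing of the second derivative of the video data locates the edge.

```python
# Sketch of sub-pixel edge location on one scan line. The second
# difference changes sign at the inflection point of the iris-to-pupil
# transition; linear interpolation between samples gives a fractional
# dot position. Search radius is an assumed parameter.

def subpixel_edge(row, near, radius=3):
    """row: brightness values along one video line; near: CAM pointer
    to the approximate edge. Returns a fractional dot position, or
    the pointer itself if no sign change is found nearby."""
    def d2(x):                                 # discrete second difference
        return row[x + 1] - 2 * row[x] + row[x - 1]
    lo = max(near - radius, 1)
    hi = min(near + radius, len(row) - 3)
    for x in range(lo, hi + 1):
        a, b = d2(x), d2(x + 1)
        if a * b < 0:                          # strict sign change
            return x + a / (a - b)             # interpolated zero crossing
    return float(near)
```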
- the video data may change so rapidly at the transition from the iris to the pupil that it passes through a bin without being clocked. For this reason the CAM system works best if the lowest bin levels are used. It is likely that the flat-bottomed pupil data will indicate the transition from pupil to iris, since the pupil data does not change rapidly near the pupil level.
- the CAM level for finding the pupil edges is chosen by searching the lines, starting at the lowest bin level, and moving up bin levels until a non-zero datum is found. This bin and the bin above same are used in conjunction to find the edges. The bin with datum which indicates a larger pupil is used. This allows the pupil data to split a level, some of the data in one level, and some of the data above that level, without error.
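A sketch of that bin-selection rule, assuming CAM cells are held as a Python list with None marking never-written cells (the data values are hypothetical):

```python
def pupil_edges(fwd_cam, rev_cam):
    """Search from the lowest bin up for the first non-empty cell,
    then combine that bin with the bin above it, taking whichever
    data indicates the larger pupil opening. This tolerates pupil
    brightness that splits across two adjacent bins."""
    b = next((i for i, cell in enumerate(fwd_cam) if cell is not None), None)
    if b is None:
        return None                      # no pupil-level data on this line
    candidates = [b, b + 1] if b + 1 < len(fwd_cam) else [b]
    lefts = [rev_cam[i] for i in candidates if rev_cam[i] is not None]
    rights = [fwd_cam[i] for i in candidates if fwd_cam[i] is not None]
    return min(lefts), max(rights)       # widest opening wins

# Pupil data split between bins 1 and 2 on one line (hypothetical):
fwd = [None, 9, 7] + [None] * 29         # right-edge pointers
rev = [None, 5, 6] + [None] * 29         # left-edge pointers
assert pupil_edges(fwd, rev) == (5, 9)
```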
- the pupil falls into just a few bins since it is dark.
- the iris falls into many bins since it has many brightness levels.
- the trial lens holder cannot be included in the video window. It would appear dark, and would be the last occurrence of said dark data, writing over the pupil data. It is important that the pupil be the only dark data in the video window XX.
- CAM can be searched at a low bin level to find the dark pupil area.
- the pointer in CAM will point to the edge of the pupil opening where the brightness increases when the iris occurs and the brightness data no longer falls into the pupil brightness bin.
- the last value written into the CAM cell forms a pointer to RAM where the edge of the pupil can be found on that line.
- the video data is interlaced, that is, half of the image (field) is sent, scanned left to right, followed by the interlaced field (one half line down), to form a complete image (frame). This is part of the standard television transmission method.
- the data from one frame is stored in two video RAMs, one for each field, and two CAMs.
- the video data is stored in the first RAM and CAM.
- the CAM contains pointers to the right edge of the pupil.
- the video data is stored in the second RAM and the first RAM data is read backwards into the second CAM.
- the first field is used to find the right and left edges of the pupil, since reading the RAM data backwards causes the last occurrence of a specific brightness bin to occur on the left edge of the pupil.
- RAM contains an interlaced image of the dark pupil ready for detailed examination and the CAMs contain pointers to both the left and right pupil edges.
- a constant is added to the right pointer and subtracted from the left pointer to form starting points for the determination of the zero crossing of the second derivative, as outlined in the original disclosure.
- the CAM level used to find the dark pupil is determined by searching one of the CAMs, line by line, from bin 0 to bin 7, to find the first non-zero cell.
- the level with the first CAM data is the proper bin to use (a flat bottom is assumed).
- the bin selected and the bin above same are used in conjunction to find the left and right pointers to the pupil edges for each line.
- a pointer is found to the location in RAM of the corneal reflection.
- a small box is centered around the pointer, expected to contain all of the corneal reflection data.
- the box is 16 pixels by 16 lines. Such data is spread over more than one pixel, especially if the video camera is not perfectly focused or the corneal reflection is very bright.
- the data in the box is added together to form an amplitude sum.
- the data is also multiplied by the dot number and by the line number, and each product is totaled. The resultant three sums are used for the determination of the centroid.
- centroid is determined by dividing each of the product sums by the amplitude sum to form two results, the sub-pixel dot and line location of the corneal reflection centroid. This method is termed "the weighted average method".
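A minimal sketch of the weighted average method applied to a small box of sampled amplitudes (the box contents are hypothetical):

```python
def corneal_reflection_centroid(box):
    """Weighted-average centroid: divide each coordinate-weighted sum
    by the amplitude sum to get sub-pixel dot and line positions."""
    amp_sum = dot_sum = line_sum = 0
    for line_no, row in enumerate(box):
        for dot_no, amp in enumerate(row):
            amp_sum += amp
            dot_sum += amp * dot_no
            line_sum += amp * line_no
    return dot_sum / amp_sum, line_sum / amp_sum

# A 4x4 box with a bright spot straddling four pixels: the centroid
# lands between them, at sub-pixel position (1.5, 1.5).
box = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
assert corneal_reflection_centroid(box) == (1.5, 1.5)
```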
- the CAM allows the entire process, both gathering of the video data and analysis of said data, to be accomplished during the time the shutter is open, the time when the point is presented to the patient. This allows a cost reduction as compared to a higher speed computer without CAM, or a separate computer to analyze the data after the shutter is closed.
- the process of taking the derivative increases the sensitivity to noise in the video data, in particular where the transition is not smooth.
- the maximum value of the first derivative is found and used to limit the search for the zero crossing of the second derivative. See FIGS. 8 and 9.
- This method restricts the second derivative zero crossing search to exclude unwanted data caused by noise.
- the first derivative is less sensitive to noise than the second.
- An example of this exclusion can be found in FIG. 9, where noise 300, a premature dark area prior to the pupil (such as an eyelash covering the iris) is rejected since the maximum of the first derivative occurs at 301, the edge of the pupil. This restricts the second derivative zero crossing detection activity to a small area surrounding 301.
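The rejection logic can be sketched as follows. The window width is an assumed illustrative value; the patent describes the idea (restrict the second-derivative zero-crossing search to the neighborhood of the first-derivative maximum) rather than a specific size.

```python
def pupil_edge(video, window=3):
    """Find the edge as the zero crossing (sign change) of the second
    derivative, searched only near the maximum-magnitude point of the
    first derivative. Noise transitions produce a smaller first
    derivative and so never define the search neighborhood."""
    d1 = [video[i + 1] - video[i] for i in range(len(video) - 1)]
    d2 = [d1[i + 1] - d1[i] for i in range(len(d1) - 1)]
    peak = max(range(len(d1)), key=lambda i: abs(d1[i]))
    lo, hi = max(0, peak - window), min(len(d2) - 1, peak + window)
    for i in range(lo, hi):
        if d2[i] == 0 or (d2[i] > 0) != (d2[i + 1] > 0):
            return i + 1                 # dot index of the sign change
    return peak

# Iris near level 200 with a small noise dip at dot 2, then a steep
# fall into the pupil: the edge is found at the fall, not the dip.
line = [200, 200, 195, 200, 200, 150, 60, 20, 20, 20]
assert pupil_edge(line) == 5
```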
- the bright spot in the pupil opening, item 303, a corneal reflection from an unknown source was also rejected.
- the dark pupil on one video line can be considered as an inverted truncated triangle suspended from the brightness level of the iris.
- the CAM levels form horizontal lines intersecting the triangle at specific fixed levels and the data in each of the 32 CAM cells is the horizontal position on that line of the intersection with the video data.
- Vertical lines also exist, the time at which the video data is sampled by the digital frame grabber hardware and stored in CAM.
- the data saved in a specific CAM cell occurs at the intersection of a vertical clock line and the video data.
- CAM level 12 FIG. 8 which intersects the data twice.
- the horizontal position would be written into CAM cell 12 twice, the final value being the last written value, position 400 for a left to right scan, or position 401 for a right to left scan.
- the slope of the triangle's edge can be abrupt. If the number of bins is large, so that the horizontal lines are close together, the pupil edge may fail to write into a specific bin, since that particular level was never sampled. The bottom of the pupil may cause many CAM levels to be written. This would require a later test to find the level which contains a dot number nearest the edge.
- the brightness value of the pupil can have many values during the video sweep due to noise and may be placed in two bins. This sharing is caused by forcing the data into digital bins, and the amount of sharing is data dependent.
- the method used for finding the CAM bin of interest is to search the video lines at the lowest bin level and increment the bin level and continue the search.
- the first non-zero data indicates that pupil brightness data occurred on that line. This may be a single point where the pupil dipped down into the CAM level and the level may not contain the left and right edges. To get the pupil edges, the CAM level determined above is combined with the next higher level.
- the method of data evaluation selects the data from the CAM level which forms the largest pupil opening. The left most data is used for the left edge and the right most data is used for the right edge.
- FIG. 8 illustrates a single video line crossing the pupil.
- the bright video value near the top of the page represents the iris and the dark video value near the bottom of the page represents the pupil. Also shown are the first and second derivatives of the data. These may be easily obtained utilizing standard software techniques.
- in obtaining the first derivative it has been found advantageous to use the data from every other video dot for finding the pupil edge, or every other video line for finding the pupil bottom. For example, utilizing data from video dot or line 0 and 2, 1 and 3, 2 and 4 produces a more pronounced derivative with greater freedom from noise.
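That stride-of-two difference can be sketched directly; the sample values below are hypothetical:

```python
def stride2_derivative(samples):
    """First derivative formed from every other sample (sample n+2
    minus sample n), as described above: the wider baseline gives a
    more pronounced transition that is less sensitive to
    single-sample noise than an adjacent difference."""
    return [samples[i + 2] - samples[i] for i in range(len(samples) - 2)]

assert stride2_derivative([0, 1, 4, 9, 16]) == [4, 8, 12]
```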
- the data is clocked into digital memory at each of the vertical lines (FIG. 8 and 9), and the brightness at that time is stored in RAM and is used to select the CAM bin. If the data falls into a specific CAM bin, the dot number (horizontal position) is written into that CAM bin.
- the first derivative has a single peak and the second derivative changes sign at the horizontal position of maximum vertical slope. This is selected as the best horizontal position for measuring the pupil edges.
- the dark pupil data can fall into two CAM levels, so that searching only the first (lowest) level with non-zero data does not return accurate edge positions.
- FIG. 9 is similar to FIG. 8 except a spurious disturbance has been introduced in the iris and the pupil. If the maximum of the first derivative is used to point to the search area for the second derivative, the pupil edge is found instead of the spurious data. If the search area for the second derivative covers only a small number of pixels, the spurious data is not detectable when searching for the zero crossing of the second derivative.
- the spurious signal in the iris occurs in practice, caused by a dark speck in the iris, or an eyelash.
- the spurious signal in the pupil area is caused by an unwanted reflection from the cornea. In both cases the spurious signal would be rejected since the first derivative of the data is smaller than that created by the pupil to iris transition.
- the next step in the process is to bisect chord 68 and starting at said bisect scan down the RAM data to find the transition from the dark pupil to the lighted iris area at the bottom of the pupil 60. This generates the vertical distance (y).
- Finding of the vertical distance (y) is precisely analogous to finding one of the edges of the scans of FIGS. 8 and 9. To avoid repetition, such a procedure will not be further described here.
- the horizontal distance (x) is the length of the chord from the bisect to one end of the chord 68.
- the calculation is based on the Pythagorean theorem. As is well known, the sum of the squares of the two legs of a right triangle equals the square of the hypotenuse. The triangle is shown in FIG. 5A, formed by half of the chord 68, the difference between the vertical component and the unknown radius (y-r), and the unknown radius (r) as the hypotenuse. This is only true if the pupil is a true circle, an assumption for this measurement. The mathematical method applies equally well to an arbitrary chord 68 placed below the pupil 60 center.
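Solving that triangle for the radius gives r = (x² + y²) / (2y): expanding x² + (y − r)² = r² cancels the r² terms, leaving x² + y² = 2yr. A minimal check of the algebra (the sample values are hypothetical):

```python
def pupil_radius(x, y):
    """Radius of an assumed circular pupil from the half-chord x and
    the vertical distance y from the chord bisect to the pupil bottom:
    x**2 + (y - r)**2 == r**2  =>  r = (x**2 + y**2) / (2 * y)."""
    return (x * x + y * y) / (2 * y)

# For a circle of radius 5: a half-chord of 4 sits 8 above the
# bottom of the circle, since 4**2 + (8 - 5)**2 == 5**2.
assert pupil_radius(4, 8) == 5.0
assert pupil_radius(3, 9) == 5.0
```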
Abstract
Description
Claims (13)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/172,136 US5491757A (en) | 1993-12-22 | 1993-12-22 | Field tester gaze tracking using content addressable memories to improve image data analysis speed |
ES94116372T ES2225826T3 (en) | 1993-12-22 | 1994-10-18 | IMPROVEMENTS IN THE FOLLOW-UP OF THE LOOK IN A VISUAL FIELD CHECKER. |
DE1994633918 DE69433918T2 (en) | 1993-12-22 | 1994-10-18 | Gaze tracking improvements for perimeters |
EP19940116372 EP0659382B1 (en) | 1993-12-22 | 1994-10-18 | Improvements in visual field tester gaze tracking |
JP32066294A JP3670695B2 (en) | 1993-12-22 | 1994-12-22 | Video image information processing device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/172,136 US5491757A (en) | 1993-12-22 | 1993-12-22 | Field tester gaze tracking using content addressable memories to improve image data analysis speed |
Publications (1)
Publication Number | Publication Date |
---|---|
US5491757A true US5491757A (en) | 1996-02-13 |
Family
ID=22626520
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/172,136 Expired - Lifetime US5491757A (en) | 1993-12-22 | 1993-12-22 | Field tester gaze tracking using content addressable memories to improve image data analysis speed |
Country Status (5)
Country | Link |
---|---|
US (1) | US5491757A (en) |
EP (1) | EP0659382B1 (en) |
JP (1) | JP3670695B2 (en) |
DE (1) | DE69433918T2 (en) |
ES (1) | ES2225826T3 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1407710B1 (en) | 2002-10-08 | 2005-08-10 | Inami & Co., Ltd. | Computer controlled perimetry system |
GB2398631A (en) * | 2002-12-19 | 2004-08-25 | Ahmed A Assaf | Computerised assessment of the ocular motility fileds |
US7575322B2 (en) * | 2007-05-11 | 2009-08-18 | Amo Development Llc. | Auto-alignment and auto-focus system and method |
WO2023220148A1 (en) * | 2022-05-10 | 2023-11-16 | Mayo Foundation For Medical Education And Research | Adjustable chin rest apparatus for visual field system |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US1960111A (en) * | 1928-11-26 | 1934-05-22 | American Optical Corp | Eye testing device |
US4145123A (en) * | 1974-08-30 | 1979-03-20 | Optische Werke G. Rodenstock | Perimeter |
US4429961A (en) * | 1981-08-14 | 1984-02-07 | Sheingorn Larry A | Visual field testing device |
US4748502A (en) * | 1986-08-18 | 1988-05-31 | Sentient Systems Technology, Inc. | Computer vision system based upon solid state image sensor |
US4854694A (en) * | 1986-06-06 | 1989-08-08 | Kowa Company Limited | Eye fixation monitor |
US4928260A (en) * | 1988-05-11 | 1990-05-22 | Advanced Micro Devices, Inc. | Content addressable memory array with priority encoder |
US4950069A (en) * | 1988-11-04 | 1990-08-21 | University Of Virginia | Eye movement detector with improved calibration and speed |
US4973149A (en) * | 1987-08-19 | 1990-11-27 | Center For Innovative Technology | Eye movement detector |
US5008946A (en) * | 1987-09-09 | 1991-04-16 | Aisin Seiki K.K. | System for recognizing image |
US5066117A (en) * | 1985-02-26 | 1991-11-19 | Canon Kabushiki Kaisha | Perimeter |
US5214456A (en) * | 1991-10-09 | 1993-05-25 | Computed Anatomy Incorporated | Mapping of corneal topography with display of pupil perimeter |
US5220361A (en) * | 1991-06-05 | 1993-06-15 | Allergan Humphrey | Gaze tracking for field analyzer |
WO1993014692A1 (en) * | 1992-01-30 | 1993-08-05 | Mäk Technologies, Inc. | High speed eye tracking device and method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL82112A0 (en) * | 1986-04-10 | 1987-10-30 | Techna Vision Inc | Optical-mechanical system for an automated perimeter |
US4836670A (en) * | 1987-08-19 | 1989-06-06 | Center For Innovative Technology | Eye movement detector |
US5231674A (en) * | 1989-06-09 | 1993-07-27 | Lc Technologies, Inc. | Eye tracking method and apparatus |
IT1231669B (en) * | 1989-07-31 | 1991-12-18 | Lectrikon Srl | MECHANICAL DEVICE FOR POSITIONING THE PATIENT'S HEAD IN ELECTROMEDICAL APPARATUS |
US5257220A (en) * | 1992-03-13 | 1993-10-26 | Research Foundation Of The State Univ. Of N.Y. | Digital data memory unit and memory unit array |
- 1993
- 1993-12-22 US US08/172,136 patent/US5491757A/en not_active Expired - Lifetime
- 1994
- 1994-10-18 DE DE1994633918 patent/DE69433918T2/en not_active Expired - Lifetime
- 1994-10-18 ES ES94116372T patent/ES2225826T3/en not_active Expired - Lifetime
- 1994-10-18 EP EP19940116372 patent/EP0659382B1/en not_active Expired - Lifetime
- 1994-12-22 JP JP32066294A patent/JP3670695B2/en not_active Expired - Fee Related
Non-Patent Citations (4)
Title |
---|
ISCAN® Eye Movement Monitoring Research Laboratory, brochure, 1989. |
Myers, Glenn A., et al, "Eye Monitor", IEEE Journal, Mar. 1991, pp. 14-21. |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5644642A (en) * | 1995-04-03 | 1997-07-01 | Carl Zeiss, Inc. | Gaze tracking using optical coherence tomography |
US6381339B1 (en) * | 1997-01-15 | 2002-04-30 | Winton Emery Brown | Image system evaluation method and apparatus using eye motion tracking |
US5790235A (en) * | 1997-03-26 | 1998-08-04 | Carl Zeiss, Inc. | Method and apparatus to measure pupil size and position |
US5852489A (en) * | 1997-12-23 | 1998-12-22 | Chen; Chi | Digital virtual chiasm for controlled stimulation of visual cortices |
US6204828B1 (en) | 1998-03-31 | 2001-03-20 | International Business Machines Corporation | Integrated gaze/manual cursor positioning system |
US6049486A (en) * | 1999-01-04 | 2000-04-11 | Taiwan Semiconductor Manufacturing Company | Triple mode erase scheme for improving flash EEPROM cell threshold voltage (VT) cycling closure effect |
US20040070728A1 (en) * | 2001-02-21 | 2004-04-15 | Roland Bergner | Method for determining distances in the anterior ocular segment |
US7284858B2 (en) * | 2001-02-21 | 2007-10-23 | Carl Zeiss Meditec Ag | Method for determining distances in the anterior ocular segment |
US7391887B2 (en) * | 2001-08-15 | 2008-06-24 | Qinetiq Limited | Eye tracking systems |
US20040196433A1 (en) * | 2001-08-15 | 2004-10-07 | Durnell L.Aurence | Eye tracking systems |
US20050254009A1 (en) * | 2004-05-12 | 2005-11-17 | Chris Baker | Motorized patient support for eye examination or treatment |
US7401921B2 (en) | 2004-05-12 | 2008-07-22 | Carl Zeiss Meditec, Inc. | Motorized patient support for eye examination or treatment |
US20100149488A1 (en) * | 2007-03-08 | 2010-06-17 | Patrick Lo | Apparatus and method for objective perimetry visual field test |
US8500278B2 (en) | 2007-03-08 | 2013-08-06 | Liang Chen | Apparatus and method for objective perimetry visual field test |
US9167965B2 (en) * | 2010-10-15 | 2015-10-27 | Universidad De Murcia | Instrument for rapid measurement of the optical properties of the eye in the entire field of vision |
US20130265544A1 (en) * | 2010-10-15 | 2013-10-10 | Universidad De Murcia | Instrument for rapid measurement of the optical properties of the eye in the entire field of vision |
WO2012123549A1 (en) | 2011-03-17 | 2012-09-20 | Carl Zeiss Meditec Ag | Systems and methods for refractive correction in visual field testing |
US8668338B2 (en) | 2011-03-17 | 2014-03-11 | Carl Zeiss Meditec, Inc. | Systems and methods for refractive correction in visual field testing |
WO2012146710A1 (en) | 2011-04-28 | 2012-11-01 | Carl Zeiss Meditec Ag | Sytems and method for improved visual field testing |
US8684529B2 (en) | 2011-04-28 | 2014-04-01 | Carl Zeiss Meditec, Inc. | Systems and methods for improved visual field testing |
US9179833B2 (en) | 2013-02-28 | 2015-11-10 | Carl Zeiss Meditec, Inc. | Systems and methods for improved ease and accuracy of gaze tracking |
US9872615B2 (en) | 2013-02-28 | 2018-01-23 | Carl Zeiss Meditec, Inc. | Systems and methods for improved ease and accuracy of gaze tracking |
US10376139B2 (en) | 2013-02-28 | 2019-08-13 | Carl Zeiss Meditec, Inc. | Systems and methods for improved ease and accuracy of gaze tracking |
US9261959B1 (en) | 2013-03-28 | 2016-02-16 | Google Inc. | Input detection |
US9349944B2 (en) | 2013-12-27 | 2016-05-24 | Samsung Electronics Co., Ltd. | Magnetic tunnel junction device |
US20160213551A1 (en) * | 2015-01-22 | 2016-07-28 | Ovard, Llc | Gaze stabilization system and method |
US10716730B2 (en) * | 2015-01-22 | 2020-07-21 | Ovard, Llc | Gaze stabilization system and method |
US10058241B2 (en) | 2016-02-29 | 2018-08-28 | Carl Zeiss Meditec, Inc. | Systems and methods for improved visual field testing |
Also Published As
Publication number | Publication date |
---|---|
ES2225826T3 (en) | 2005-03-16 |
JP3670695B2 (en) | 2005-07-13 |
JPH07194549A (en) | 1995-08-01 |
EP0659382A2 (en) | 1995-06-28 |
DE69433918D1 (en) | 2004-09-02 |
EP0659382B1 (en) | 2004-07-28 |
EP0659382A3 (en) | 1998-09-02 |
DE69433918T2 (en) | 2005-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5491757A (en) | Field tester gaze tracking using content addressable memories to improve image data analysis speed | |
US5220361A (en) | Gaze tracking for field analyzer | |
US4019813A (en) | Optical apparatus for obtaining measurements of portions of the eye | |
US8113658B2 (en) | Ophthalmic diagnostic instrument | |
US7572008B2 (en) | Method and installation for detecting and following an eye and the gaze direction thereof | |
US6206522B1 (en) | Apparatus for evaluating the visual field of a patient | |
CA1154988A (en) | Method and apparatus for analysis of corneal shape | |
US4993826A (en) | Topography measuring apparatus | |
US5106183A (en) | Topography measuring apparatus | |
US5841511A (en) | Method of corneal analysis using a checkered placido apparatus | |
KR100992182B1 (en) | Ophthalmic binocular wavefront measurement system | |
US4902123A (en) | Topography measuring apparatus | |
EP0395831A1 (en) | Topography measuring apparatus | |
CN101596096A (en) | Heed contacted measure eyes axial length and/or corneal curvature and/or anterior chamber depth, the apparatus and method measured of IOL especially | |
CN1395902A (en) | Cornea measuring equipment by optical coherent chromatography X-ray photographic method | |
US6042232A (en) | Automatic optometer evaluation method using data over a wide range of focusing positions | |
CA2990524C (en) | Purkinje meter and method for automatic evaluation | |
Levine | Performance of an eyetracker for office use | |
JP3387500B2 (en) | Checkered plaseeding device | |
JP3594466B2 (en) | Eye refractive power measuring device | |
JPH04200524A (en) | Contact lens position correcting device for measuring eyeball movement | |
Schaeffel et al. | Measurement of pupil size, direction of gaze, and refractive state by on-line analysis of digitized video images | |
Augustyniak et al. | Complete scanpaths analysis toolbox |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HUMPHREY INSTRUMENTS, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEHMER, DONALD E.;KIRSCHBAUM, ALAN R.;REEL/FRAME:006907/0397 Effective date: 19940110 |
|
AS | Assignment |
Owner name: CARL ZEISS, INC., NEW YORK Free format text: MERGER;ASSIGNOR:HUMPHREY INSTRUMENTS, INC.;REEL/FRAME:007709/0901 Effective date: 19940930 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FPAY | Fee payment |
Year of fee payment: 12 |