Citation
(2009), "Patent abstracts", Sensor Review, Vol. 29 No. 1. https://doi.org/10.1108/sr.2009.08729aad.005
Publisher
Emerald Group Publishing Limited
Copyright © 2009, Emerald Group Publishing Limited
Patent abstracts
Article Type: Patent abstracts. From: Sensor Review, Volume 29, Issue 1
Title: Machine with artificial sight for automatic separation by colour of recyclable plastics, with multispectral sight
Applicant: Picvisa Machine Vision Systems (ES)
Patent number: EP1967294
Publication date: September 10, 2008
Abstract
The machine with artificial vision for automatic separation by colour of recyclable plastics uses multispectral vision covering the ultraviolet, visible and infrared zones of the spectrum simultaneously, at wavelengths of less than 400 nm, between 400 and 700 nm, and greater than 700 nm, obtaining for each piece of plastic analysed its spectrogram, which is specific to each type of plastic. The machine as a whole is made up of a conveyor belt (1) for the plastics to be analysed, a line module (2) housing the multispectral lighting and vision systems for image capture, an electronic control module with touch screen (5) housing the processing cards, a computer system that carries out the multispectral analysis of the pieces of plastic and an electronic interface for controlling the expulsion electrovalves (4), a blowing system (3), and material conveyor belts (7) for extracting the main flow and the separated components.
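In software terms, the separation step amounts to matching each piece's measured spectrogram against reference signatures for the plastic types of interest and firing the expulsion valves when the target material is recognised. A minimal Python sketch of that idea follows; the wavelength grid, reference spectra and nearest-match rule are illustrative assumptions, not values taken from the patent.

import numpy as np

# Illustrative wavelength grid spanning the ultraviolet (<400 nm), visible
# (400-700 nm) and infrared (>700 nm) zones mentioned in the abstract.
WAVELENGTHS_NM = np.arange(350, 1001, 50)

# Hypothetical reference spectrograms (reflectance per wavelength) for a few
# recyclable plastics; real signatures would come from calibration measurements.
REFERENCE_SPECTRA = {
    "PET": np.linspace(0.2, 0.8, WAVELENGTHS_NM.size),
    "HDPE": np.linspace(0.5, 0.6, WAVELENGTHS_NM.size),
    "PVC": np.linspace(0.7, 0.3, WAVELENGTHS_NM.size),
}

def classify_piece(spectrogram):
    """Return the reference plastic whose spectrum is closest (Euclidean distance)."""
    distances = {name: np.linalg.norm(spectrogram - ref)
                 for name, ref in REFERENCE_SPECTRA.items()}
    return min(distances, key=distances.get)

def should_eject(spectrogram, target="PET"):
    """Decide whether the expulsion electrovalves should fire for this piece."""
    return classify_piece(spectrogram) == target

# Example: a noisy measurement that should be recognised as PET.
measured = REFERENCE_SPECTRA["PET"] + np.random.normal(0.0, 0.02, WAVELENGTHS_NM.size)
print(classify_piece(measured), should_eject(measured))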
Title: Multiple illumination sources to level spectral response for machine vision camera
Applicant: Lockheed Corp. (USA)
Patent number: US2008017556
Publication date: January 24, 2008
Abstract
In one embodiment, an apparatus comprises a camera and a group of light emitting diodes configured and arranged to illuminate an object imaged by the camera. The group of LEDs comprises at least one first LED of a first type that emits light with a first spectrum profile and at least one second LED of a second type that emits light with a second spectrum profile that is substantially different from the first spectrum profile. The apparatus may further comprise an adjustment mechanism for adjusting the relative amounts of illumination energy generated by the at least one first LED and the at least one second LED during each of a plurality of scan cycles of the camera.
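One way to read the adjustment mechanism is as a per-scan-cycle rebalancing of the two LED drive levels so that the camera sees a roughly even response across both spectral bands. The Python sketch below illustrates that idea only; the nudge rule, target level and variable names are assumptions rather than the patent's implementation.

def rebalance_led_duty(mean_band1, mean_band2, duty1, duty2,
                       target=128.0, step=0.05):
    """Nudge the duty cycles of two LED types toward a common target level.

    mean_band1 / mean_band2: average pixel level measured in each LED's
    spectral band during the previous scan cycle.
    duty1 / duty2: current duty cycles (0..1) driving each LED type.
    """
    def nudge(duty, measured):
        if measured < target:
            duty += step    # band reads too dark: raise that LED's output
        elif measured > target:
            duty -= step    # band reads too bright: lower it
        return min(max(duty, 0.0), 1.0)

    return nudge(duty1, mean_band1), nudge(duty2, mean_band2)

# Example: band 1 read dark and band 2 read bright during the last scan cycle.
duty1, duty2 = rebalance_led_duty(mean_band1=90.0, mean_band2=170.0,
                                  duty1=0.5, duty2=0.5)
print(round(duty1, 2), round(duty2, 2))    # 0.55 0.45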
Title: Whole-spectrum, double-CMOS linear imaging sensor detector and detecting method thereof
Applicant: Maisuote Science & Technology (CN)
Patent number: CN1888833
Publication date: January 3, 2007
Abstract
A full-spectrum, double-CMOS linear imaging sensor detector and its detection method involve an optical imaging lens, a three-layer spectral prism and two high-speed linear CMOS imaging sensors. The optical lens images the object; the right-angle spectral prism filters out the infrared component and splits the light into two balanced spectra according to the prism's transmission/reflection ratio T/R; the two CMOS sensors then perform the photoelectric conversion of these spectra, yielding a target brightness signal L and a color saturation signal S, which together distinguish color changes of the target. The color saturation signal S test replaces the simple strong/weak single test of the blue spectrum (B) signal used in conventional color-sorting machines; it corresponds more closely to human visual color perception and enhances the discrimination ability of the equipment.
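In essence the two CMOS channels supply two balanced spectral samples from which the brightness signal L and saturation signal S are derived and compared against a sorting threshold. A minimal Python sketch of that arithmetic is given below; the formulas for L and S are plausible assumptions, since the abstract does not disclose the exact ones.

def brightness_saturation(channel_t, channel_r, split_ratio=1.0):
    """Derive a brightness signal L and a saturation signal S from the
    transmitted (T) and reflected (R) channel readings of the prism split.

    split_ratio is the prism's T/R ratio used to rebalance the two channels.
    Both formulas are illustrative; the abstract does not disclose exact ones.
    """
    balanced_r = channel_r * split_ratio           # compensate the T/R split
    L = 0.5 * (channel_t + balanced_r)             # overall brightness
    S = abs(channel_t - balanced_r) / (L + 1e-9)   # relative spectral imbalance
    return L, S

def is_discoloured(channel_t, channel_r, s_threshold=0.25):
    """Flag a target whose saturation signal exceeds the sorting threshold."""
    _, S = brightness_saturation(channel_t, channel_r)
    return S > s_threshold

print(brightness_saturation(0.62, 0.48))
print(is_discoloured(0.62, 0.30))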
Title: Multiple spectrum meat freshness artificial intelligence measurement method and system
Applicant: Univ. Zhejiang (CN)
Patent number: CN101059424
Publication date: October 24, 2007
Abstract
The invention discloses a multi-spectral artificial-intelligence method for measuring meat freshness, together with a corresponding system, in which an adjustable light source projects a specific light beam onto the meat platform, and a 3CCD multi-spectral camera receives the light reflected from the meat and outputs the signal through an image-acquisition card to a computer processing system. Using a meat database held in the computer, the system applies the appropriate pre-treatments for different meats, extracts several characteristic wavelengths, selects a pixel group as the object of further analysis and, through artificial-intelligence judgment against prior models, recognizes three classes: fresh meat, sub-fresh meat and spoiled meat, displaying the recognition result on the computer. The invention combines machine vision, image processing and artificial-intelligence techniques, allowing the freshness of meat to be judged quickly, non-destructively and accurately.
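The decision stage reduces to extracting features at a few characteristic wavelengths from the selected pixel group and mapping them to one of the three freshness classes. The Python sketch below shows only that pipeline shape; the wavelengths, centroid values and nearest-centroid rule are illustrative assumptions, not the trained models referred to in the patent.

import numpy as np

# Hypothetical characteristic wavelengths (nm) and per-class feature centroids
# learned beforehand from a meat database; all values are purely illustrative.
CHARACTERISTIC_BANDS = (450, 550, 700)
CLASS_CENTROIDS = {
    "fresh": np.array([0.80, 0.60, 0.40]),
    "sub-fresh": np.array([0.65, 0.55, 0.50]),
    "spoiled": np.array([0.45, 0.50, 0.65]),
}

def extract_features(pixel_group):
    """Average reflectance of the selected pixel group in each characteristic band.

    pixel_group has shape (n_pixels, n_bands) for the chosen region of meat.
    """
    return pixel_group.mean(axis=0)

def classify_freshness(pixel_group):
    """Assign fresh / sub-fresh / spoiled by the nearest centroid in feature space."""
    features = extract_features(pixel_group)
    return min(CLASS_CENTROIDS,
               key=lambda label: np.linalg.norm(features - CLASS_CENTROIDS[label]))

# Example: 100 pixels whose band averages sit near the "fresh" centroid.
sample = np.random.normal([0.78, 0.61, 0.42], 0.02, size=(100, 3))
print(classify_freshness(sample))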
Title: Motion tracking system for real time adaptive imaging and spectroscopy
Inventor: Ernst Thomas, M. (USA); Prieto Thomas, E. (USA); Armstrong Brian, S.R. (USA)
Patent number: US2007280508
Publication date: December 6, 2007
Abstract
Current magnetic resonance imaging (MRI) technologies require subjects to remain largely motionless to achieve high-quality MR scans, typically for 5-10 min at a time. However, lying absolutely still inside the tight MRI tunnel is a difficult task, especially for children, very sick patients or the mentally ill. Even motion of less than 1 mm or 1° can corrupt a scan. This invention involves a system that adaptively compensates for subject motion in real time. An object orientation marker, preferably a retro-grate reflector (RGR), is placed on the patient's head or other body organ of interest during MRI. The RGR makes it possible to measure the six degrees of freedom (x, y and z translations, and pitch, yaw and roll), or "pose", required to track the organ of interest. A camera-based tracking system observes the marker and continuously extracts its pose. The pose from the tracking system is sent to the MR scanner via an interface, allowing for continuous correction of scan planes and position in real time. The RGR-based motion correction system has significant advantages over other approaches, including faster tracking speed, better stability, automatic calibration, lack of interference with the MR measurement process, improved ease of use and long-term stability. RGR-based motion tracking can also be used to correct for motion in awake animals, or in conjunction with other in vivo imaging techniques, such as computed tomography, positron emission tomography, etc.
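The correction loop can be summarised as: read the marker's six-degree-of-freedom pose from the camera system, convert it to a rigid transform, and re-express the scan plane in the moved frame before the next acquisition. The Python sketch below shows only that geometric step with a plain rotation-matrix composition; it is not the MR scanner interface described in the patent.

import numpy as np

def rotation_matrix(pitch, yaw, roll):
    """Compose a rotation matrix from pitch, yaw and roll angles (radians)."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cz, sz = np.cos(roll), np.sin(roll)
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return rz @ ry @ rx

def corrected_scan_plane(normal, origin, pose):
    """Re-express a scan plane (normal vector, origin point) after the tracked
    marker reports a new pose: x, y, z translations (mm) and pitch, yaw, roll
    rotations (radians)."""
    R = rotation_matrix(pose["pitch"], pose["yaw"], pose["roll"])
    t = np.array([pose["x"], pose["y"], pose["z"]])
    return R @ normal, R @ origin + t

# Example: an axial plane nudged by a small head rotation and a 1 mm shift.
pose = dict(x=1.0, y=0.0, z=0.0, pitch=0.01, yaw=0.0, roll=0.02)
print(corrected_scan_plane(np.array([0.0, 0.0, 1.0]), np.zeros(3), pose))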
Title: Combined visual-optic and passive infra-red technologies and the corresponding systems for detection and identification of skin cancer precursors, nevi and tumors for early diagnosis
Inventor: Passive Imaging Medical System (IL)
Patent number: US2007073156
Publication date: March 29, 2007
Abstract
A device and method to non-invasively identify pathological skin lesions. The method and device detect and identify different kinds of skin nevi, tumors, lesions and cancers (namely, melanoma) by combined analysis of visible and infra-red optical signals, based on integral and spectral regimes for detection and imaging, leading to earlier warning and treatment of potentially dangerous conditions.
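Since the abstract states only that visible and infra-red signals are analysed jointly, the Python sketch below shows no more than a generic fusion of two per-lesion scores into a single flag; the scores, weights and threshold are illustrative assumptions, not the patent's method.

def lesion_risk(visible_score, infrared_score,
                w_visible=0.5, w_infrared=0.5, threshold=0.6):
    """Fuse a visible-band score and an infra-red-band score (both 0..1, higher
    meaning more suspicious) into a single flag for lesions that warrant closer
    examination. Weights and threshold are illustrative placeholders."""
    combined = w_visible * visible_score + w_infrared * infrared_score
    return combined >= threshold

print(lesion_risk(0.70, 0.65))   # True: both channels look suspicious
print(lesion_risk(0.30, 0.40))   # False: neither channel raises concern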
Title: System and method for employing infrared illumination for machine vision
Applicant: Cognex Technology and Invest C (USA); Jakoboski Timothy, A. (USA); Looney Brian (USA)
Patent number: WO2007067543
Publication date: June 14, 2007
Abstract
This invention provides a machine vision device, employing both bright-field and dark-field illumination in the infrared region, adapted to read inscribed symbology on the surface of an object, such as a wafer, that is covered in photoresist. Using illumination in this spectral band, an inscribed symbol can be read by a camera sensor substantially unaffected by the presence and/or number of layers of photoresist covering the symbol. The camera sensor is tuned to receive such illumination, and is thereby provided with an image that distinguishes the symbol's scribe lines on the underlying wafer surface from the surrounding specular wafer surface. The device includes a housing that supports the imager and imager lens below an array of IR LEDs. The sensor has an optical axis that is reflected from horizontal to vertical by a mirror and then back to horizontal by a beam splitter that is aligned with two spherical lenses and an outlet window at the front of the housing. The array is located in line with lenticular arrays behind the beam splitter, along the central optical axis of the lenses and window.
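Reading the inscribed symbol comes down to combining a bright-field and a dark-field infrared exposure so that the scribe lines stand out against the specular wafer surface. The Python sketch below illustrates one plausible combination, a per-pixel difference and threshold; the actual image processing used by the device is not disclosed in the abstract.

import numpy as np

def scribe_line_mask(bright_field, dark_field, threshold=0.2):
    """Return a boolean mask of likely scribe-line pixels.

    bright_field / dark_field: infrared images normalised to 0..1. The specular
    wafer surface reads bright under bright-field and dark under dark-field
    illumination, while scribe lines scatter light and behave the opposite way,
    so the difference (dark_field - bright_field) highlights them.
    """
    contrast = dark_field - bright_field
    return contrast > threshold

# Example on tiny synthetic images with one "scribed" pixel at (1, 1).
bright = np.full((3, 3), 0.9)
bright[1, 1] = 0.2
dark = np.full((3, 3), 0.1)
dark[1, 1] = 0.7
print(scribe_line_mask(bright, dark).astype(int))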