A near-infrared spectral reflectance system was developed and tested on-line to predict 14-day aged, cooked beef tenderness. A contact probe with a built-in tungsten-halogen light source supplied broadband light to the ribeye surface. Fiber optics in the probe transmitted reflected light to a spectrometer with a spectral range of 400-2500 nm.
In the first phase, steak samples (n=292) were brought from packing plants to the lab and scanned with the spectrometer. After scanning, samples were vacuum-packaged and aged for 14 days. They were then cooked in an impingement oven to an internal temperature of 70°C. Slice-shear force values were recorded for tenderness reference.
In phase two, the spectrometer was modified for packing-plant conditions. Spectral scans were obtained on-line on ribbed carcasses (n=276). A partial least squares regression model was developed to predict tenderness scores from spectral reflectance. In phase three, the model was validated by scanning carcasses (n=200) on-line. The predicted shear-force values and samples were sent to the U.S. Meat Animal Research Center for third-party validation. At certification levels up to 70%, the system successfully sorted tough carcasses from tender ones.
A video image analysis system was developed to support automation of beef quality grading. Forty images of ribeye steaks were acquired. Fat and lean meat were differentiated using a fuzzy c-means clustering algorithm. The longissimus dorsi (l.d.) muscle was segmented from the ribeye using morphological operations. At the end of each iteration of erosion and dilation, a convex hull was fitted to the image and compactness was measured. The number of iterations was selected to yield the most compact l.d. region. The match between the l.d. muscle traced by an expert grader and that segmented by the program was 95.9%. Marbling and color features were extracted from the l.d. muscle and used to build regression models to predict marbling and color scores. Quality grade was predicted using another regression model incorporating all features. Grades predicted by the model were statistically equivalent to those assigned by expert graders.
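The iterative erosion/dilation and convex-hull compactness criterion can be illustrated as follows, assuming SciPy. The binary lean-tissue mask here is synthetic (a disk with a thin appendage standing in for connected secondary muscle), and the exact morphological sequence in the paper may differ.

```python
# Sketch of selecting the opening depth that maximizes compactness,
# where compactness = region area / area of its convex hull.
import numpy as np
from scipy import ndimage
from scipy.spatial import ConvexHull

def compactness(mask):
    """Area of the binary region divided by the area of its convex hull."""
    pts = np.column_stack(np.nonzero(mask))
    if len(pts) < 3:
        return 0.0
    hull = ConvexHull(pts)
    return mask.sum() / hull.volume   # in 2-D, .volume is the hull's area

# Synthetic lean mask: a disk plus a thin bar (removed by opening)
yy, xx = np.mgrid[:100, :100]
mask = ((yy - 50) ** 2 + (xx - 40) ** 2) < 400
mask |= (np.abs(yy - 50) < 2) & (xx > 40) & (xx < 95)

c0 = compactness(mask)
best_iter, best_c, best_mask = 0, c0, mask
for i in range(1, 8):
    # Morphological opening: erode i times, then dilate i times
    opened = ndimage.binary_dilation(
        ndimage.binary_erosion(mask, iterations=i), iterations=i)
    c = compactness(opened)
    if c > best_c:
        best_iter, best_c, best_mask = i, c, opened
```

Opening removes the thin appendage after a few iterations, the convex hull shrinks to fit the remaining disk, and compactness rises; the iteration count with the highest compactness is kept.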
In recent work, we demonstrated a prototype machine vision seedling inspection system that shows strong promise for automating production-line grading. Precise morphological measurements and accurate grade assignment require reliable identification of the seedling root collar location. The large variability of seedling morphology makes automatic root collar location the most challenging aspect of machine vision seedling inspection. This function is currently achieved using a heuristic algorithm that relies on many operator-controlled parameters to extract root collar location cues from seedling shape. Artificial intelligence techniques, specifically neural networks, have yielded excellent performance in similar pattern recognition applications. Neural networks were therefore developed to locate the seedling root collar in digital images acquired by a machine vision inspection system. Several neural network architectures and input feature sets were evaluated. Input features consisted of those used by the heuristic algorithm, plus additional features extracted from each line in the seedling image. The performance of several neural networks was superior to that of the heuristic algorithm. Good performance was achieved by networks that used local (single-line) features along with normalized line number as inputs. A hierarchical network that took inputs from 15 lines over a 140-mm window provided improved performance in one case. The best networks identified the root collar location with an average error of less than 1 mm and an error standard deviation of 12 mm.
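A minimal sketch of the neural-network approach, assuming scikit-learn's MLPRegressor: the per-line width features, network size, and collar geometry below are invented for illustration, not the features or architecture actually evaluated.

```python
# Sketch: regress a normalized root-collar position from per-line width
# features. Synthetic profiles model a narrow stem above the collar and a
# wider root mass below it.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_seedlings, n_lines = 200, 64

collar = rng.integers(20, 44, size=n_seedlings)      # true collar line index
lines = np.arange(n_lines)
# Width per scan line: ~2 px above the collar, ~6 px below, plus noise
widths = np.where(lines[None, :] < collar[:, None], 2.0, 6.0)
widths += rng.normal(scale=0.3, size=widths.shape)

X = widths                    # one width-profile feature vector per seedling
y = collar / n_lines          # normalized collar position as the target

net = MLPRegressor(hidden_layer_sizes=(16,), solver="lbfgs",
                   max_iter=2000, random_state=0)
net.fit(X[:150], y[:150])

pred = net.predict(X[150:]) * n_lines     # back to a line index
err = np.abs(pred - collar[150:])         # localization error in lines
```

With real images each line's features would come from the segmented seedling silhouette, and the error would be converted to millimeters via the camera's transverse resolution.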
A PC-based machine vision system providing rapid measurement of bare-root tree seedling morphological features has been designed. The system uses backlighting and a 2048-pixel line-scan camera to acquire images with transverse resolutions as high as 0.05 mm for precise measurement of stem diameter. Individual seedlings are manually loaded on a conveyor belt and inspected by the vision system in less than 0.25 seconds. Designed for quality control and morphological data acquisition by nursery personnel, the system provides a user-friendly, menu-driven graphical interface. The system automatically locates the seedling root collar and measures stem diameter, shoot height, sturdiness ratio, root mass length, projected shoot and root area, shoot-root area ratio, and percent fine roots. Sample statistics are computed for each measured feature. Measurements for each seedling may be stored for later analysis. Feature measurements may be compared with multi-class quality criteria to determine sample quality or to perform multi-class sorting. Statistical summary and classification reports may be printed to facilitate the communication of quality concerns with grading personnel. Tests were conducted at a commercial forest nursery to evaluate measurement precision. Four quality control personnel measured root collar diameter, stem height, and root mass length on each of 200 conifer seedlings. The same seedlings were inspected four times by the machine vision system. Machine stem diameter measurement precision was four times greater than that of manual measurements. Machine and manual measurements had comparable precision for shoot height and root mass length.
Almost two billion conifer seedlings are produced in the U.S. each year to support reforestation efforts. Seedlings are graded manually to improve viability after transplanting. Manual grading is labor-intensive and subject to human variability. Our previous research demonstrated the feasibility of automated tree seedling inspection with machine vision. Here we describe a system based on line-scan imaging, providing a three-fold increase in resolution and inspection rate. A key aspect of the system is automatic recognition of the seedling root collar. Root collar diameter, shoot height, and projected shoot and root areas are measured. Sturdiness ratio and shoot/root ratio are computed. Grade is determined by comparing measured features with pre-defined set points. Seedlings are automatically sorted. The precision of machine vision and manual measurements was determined in tests at a commercial forest nursery. Manual measurements of stem diameter, shoot height, and sturdiness ratio had standard deviations three times those of machine vision measurements. Projected shoot area was highly correlated (r² = 0.90) with shoot volume. Projected root area had good correlation (r² = 0.80) with root volume. Seedlings were inspected at rates as high as ten per second.
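Grading by comparison of measured features against pre-defined set points can be sketched as below. The feature names, thresholds, and grade labels are hypothetical illustrations, not the system's actual criteria.

```python
# Sketch of set-point grading for a seedling with measured features.
# Thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Seedling:
    diameter_mm: float   # root collar diameter
    height_mm: float     # shoot height

    @property
    def sturdiness(self) -> float:
        # Sturdiness ratio: shoot height per unit stem diameter
        return self.height_mm / self.diameter_mm

def grade(s: Seedling) -> str:
    """Three-way sort against hypothetical set points."""
    if s.diameter_mm < 3.0 or s.sturdiness > 80.0:
        return "cull"        # too thin or too spindly to transplant
    if s.diameter_mm >= 4.5 and s.sturdiness <= 60.0:
        return "premium"     # thick, sturdy stem
    return "acceptable"
```

For example, `grade(Seedling(diameter_mm=5.0, height_mm=250.0))` yields `"premium"` (sturdiness 50), while a 2.5 mm stem is culled regardless of height. Multi-class sorting simply maps each grade label to an output bin.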
A method for gauging the distance from a video camera to an object of interest is described. By using a calibrated camera-lens system, range was related to the focus of a selected object. Optimum focus of the image was determined by maximizing the high-frequency content of the Fourier transform of the object image. The Walsh-Hadamard transform was investigated as an alternative focusing function. Software was developed to determine optimum image focus and control a motorized camera lens. Range values from the video camera to target objects were calculated by the system. Calculated values were compared with measured distances. For any given distance, the difference between calculated and actual distance averaged less than 1.2%. Distance values calculated using the Walsh-Hadamard transform differed from values calculated with the Fourier transform by less than 1%.
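The Fourier-based focus measure described above can be sketched with NumPy. The cutoff fraction and the synthetic sharp/blurred image pair are assumptions for illustration; the Walsh-Hadamard variant would substitute a Hadamard transform for the FFT.

```python
# Sketch of a focus measure: total spectral energy above a low-frequency
# cutoff. A defocused (low-pass blurred) image scores lower, so the lens
# position that maximizes this measure gives optimum focus.
import numpy as np

def focus_measure(img, cutoff=0.25):
    """Sum of |FFT| outside a disk of radius cutoff * Nyquist."""
    F = np.fft.fftshift(np.fft.fft2(img))
    fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0]))
    fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1]))
    r = np.hypot(fy[:, None], fx[None, :])     # radial frequency grid
    return np.abs(F[r > cutoff * 0.5]).sum()   # Nyquist frequency is 0.5

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
# Crude defocus model: average each pixel with its four neighbours
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, -1, 0)
           + np.roll(sharp, 1, 1) + np.roll(sharp, -1, 1)) / 5
```

In the ranging system, this measure would be evaluated at each motorized-lens position, and the position of the maximum mapped to distance through the camera-lens calibration.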