[Keywords]
[Abstract]
Hyperspectral imaging is a powerful emerging technique that has been applied to quality and safety inspection of food and agricultural products. However, the shape and size selected for the region of interest (ROI) in a hyperspectral image directly affect the accuracy and stability of the measurement. In this study, a hyperspectral imaging system covering wavelengths from 330 to 1100 nm was used to acquire images of apple samples, and mean reflectance spectra were extracted from circular and square ROIs of different sizes. After spectral preprocessing to eliminate the effects of noise and irrelevant information, quantitative models of apple sugar content were built with the partial least squares (PLS) method and externally validated on a prediction set of independent samples, in order to analyze how ROI shape and size affect the modeling accuracy of hyperspectral imaging. The results showed that the sugar content model built from a circular ROI 150 pixels in diameter achieved the highest accuracy and the strongest predictive ability: the correlation coefficient of the calibration set (Rc) was 0.9305, the root mean square error of calibration (RMSEC) was 0.4331, the correlation coefficient of the prediction set (Rp) was 0.9232, and the root mean square error of prediction (RMSEP) was 0.4568. These results demonstrate that selecting an ROI of appropriate shape and size for the object under study is important for improving model accuracy and realizing the full potential of the hyperspectral imaging technique.
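To make the ROI extraction step concrete, the following is a minimal sketch, not the study's implementation, of computing the mean spectrum of a circular or square ROI from a hyperspectral cube. It assumes the cube is stored as a NumPy array of shape (rows, cols, bands) and that the ROI center and size are given in pixels; all names and dimensions are illustrative.

import numpy as np

def mean_roi_spectrum(cube, center, size, shape="circle"):
    """Mean spectrum of an ROI in a hyperspectral cube.

    cube   : ndarray of shape (rows, cols, bands)
    center : (row, col) pixel coordinates of the ROI center
    size   : diameter (circle) or side length (square) in pixels
    shape  : "circle" or "square"
    """
    rows, cols, _ = cube.shape
    r0, c0 = center
    rr, cc = np.ogrid[:rows, :cols]
    if shape == "circle":
        # Pixels within size/2 of the center form the circular ROI.
        mask = (rr - r0) ** 2 + (cc - c0) ** 2 <= (size / 2) ** 2
    else:
        # Axis-aligned square ROI of roughly the given side length.
        half = size / 2
        mask = (np.abs(rr - r0) <= half) & (np.abs(cc - c0) <= half)
    # Average the spectra of all masked pixels, one value per band.
    return cube[mask].mean(axis=0)

# Example: 150-pixel-diameter circular ROI at the image center.
cube = np.random.rand(400, 400, 256)           # placeholder image cube
spectrum = mean_roi_spectrum(cube, (200, 200), 150, "circle")
print(spectrum.shape)                          # (256,)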
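The abstract does not specify which spectral pretreatment was applied. As one common choice for suppressing high-frequency noise in reflectance spectra, the sketch below uses Savitzky-Golay smoothing; the method, window length, and polynomial order are assumptions for illustration only, not the study's settings.

import numpy as np
from scipy.signal import savgol_filter

def smooth_spectra(spectra, window=11, polyorder=2):
    """Smooth each spectrum (rows = samples, columns = wavelengths).

    Savitzky-Golay filtering fits a local polynomial in a sliding
    window along the wavelength axis; window and polyorder here are
    illustrative defaults.
    """
    return savgol_filter(spectra, window_length=window,
                         polyorder=polyorder, axis=1)

spectra = np.random.rand(120, 256)   # placeholder mean ROI spectra
smoothed = smooth_spectra(spectra)
print(smoothed.shape)                # (120, 256)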
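Finally, a minimal sketch of the PLS calibration and external validation step, with the four reported metrics (Rc, RMSEC, Rp, RMSEP), using scikit-learn's PLSRegression. The sample data, number of latent components, and calibration/prediction split below are illustrative assumptions, not the study's settings.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

def pls_metrics(X, y, n_components=10, test_size=0.25, seed=0):
    """Fit a PLS model; report Rc/RMSEC on the calibration set
    and Rp/RMSEP on an independent prediction set."""
    X_cal, X_pred, y_cal, y_pred = train_test_split(
        X, y, test_size=test_size, random_state=seed)
    pls = PLSRegression(n_components=n_components)
    pls.fit(X_cal, y_cal)
    yc = pls.predict(X_cal).ravel()
    yp = pls.predict(X_pred).ravel()
    rc = np.corrcoef(y_cal, yc)[0, 1]            # calibration correlation
    rmsec = np.sqrt(np.mean((y_cal - yc) ** 2))  # calibration RMSE
    rp = np.corrcoef(y_pred, yp)[0, 1]           # prediction correlation
    rmsep = np.sqrt(np.mean((y_pred - yp) ** 2)) # prediction RMSE
    return rc, rmsec, rp, rmsep

# Illustrative use with random data standing in for preprocessed ROI
# mean spectra (rows = samples, columns = wavelengths) and measured
# sugar content values.
X = np.random.rand(120, 256)
y = np.random.rand(120) * 5 + 10
print(pls_metrics(X, y))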
[CLC number]
[Foundation item]
Supported by the National Natural Science Foundation of China (31301236) and the Science and Technology Innovation Fund of the Beijing Academy of Agriculture and Forestry Sciences (CXJJ201314)