@david.n.forster @JFrame @mawada
I’m pulling a discussion happening in Tech over here, because it relates more to food quality than to tech.
So, the question is what wavelength range we should measure in the lab on all of the food (carrots, spinach, etc.) that we run: 200 nm (UV) to 1450 nm (NIR), or 200 nm (UV) to 2100 nm (MIR)? There is a big difference in cost and complexity between the two. Light in the 1450 - 2100 nm range is largely absorbed by water (penetration depths on the order of 50 µm, based on what our friend with optical expertise said), so the value of this range for measuring food is questionable.
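As a back-of-envelope check on that point (a rough Python sketch, not a measurement): the 1/e penetration depth in pure water is just the reciprocal of the absorption coefficient. The coefficients below are approximate figures for pure water as I remember them from published optical-constant tables (e.g. Hale & Querry 1973), so treat them as order-of-magnitude only and verify them before we lean on them; the 50 µm figure presumably also folds in scattering in real produce rather than pure water.

```python
# Rough sanity check on the "mostly absorbed by water" point: the 1/e penetration
# depth in pure water is 1/alpha, where alpha is the absorption coefficient.
# The alpha values below are approximate, order-of-magnitude figures; verify
# against a real dataset (e.g. Hale & Querry 1973) before using them for decisions.

approx_water_alpha_per_cm = {
    # wavelength (nm): absorption coefficient (1/cm), approximate
    970: 0.5,     # weak NIR water band
    1450: 29.0,   # strong water overtone band
    1940: 120.0,  # very strong water combination band
}

for wavelength_nm, alpha in sorted(approx_water_alpha_per_cm.items()):
    depth_um = (1.0 / alpha) * 1e4  # 1/e depth in micrometres (1 cm = 10,000 um)
    print(f"{wavelength_nm} nm: 1/e penetration depth ~ {depth_um:.0f} um")
```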
In addition, detectors for the mid IR region are becoming more available and easier to use; see, for example, https://www.hamamatsu.com/us/en/product/category/3100/4018/C14272/index.html from Hamamatsu. So the cost of measuring that region should drop over the next 5 years.
So… it’s technically feasible.
It has been proposed that, given a good enough instrument, there may be signals lying on top of the water signal. In some ways those signals are more useful, because they represent things that are actually dissolved in the water, which may include a wide range of useful nutrients (or signals that correlate with them).
The core questions are: What work has been done to show there are useful signals in the mid IR range for food quality detection? Are there clever ways around the water problem? Could we use the baseline water signal as a useful data point in itself?
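On the "clever ways around the water problem" and "water signal as data" questions, one standard chemometrics trick is to fit a pure-water reference spectrum to each sample spectrum and look at the residual; the fitted water coefficient then doubles as a rough measure of relative water content. The sketch below is just an illustration of that idea with made-up synthetic spectra and assumed array shapes, not a claim about what has been validated for food:

```python
# Sketch: model each sample spectrum as (scale * pure-water reference) + offset,
# fit by least squares, and examine the residual for dissolved-solute signals.
# The fitted scale is itself a crude estimate of relative water content.
import numpy as np

def remove_water_baseline(sample: np.ndarray, water_ref: np.ndarray):
    """Fit sample ~ a * water_ref + b and return (residual, a).

    sample, water_ref: 1-D absorbance spectra on the same wavelength grid.
    """
    X = np.column_stack([water_ref, np.ones_like(water_ref)])  # water + offset terms
    coeffs, *_ = np.linalg.lstsq(X, sample, rcond=None)
    residual = sample - X @ coeffs
    return residual, coeffs[0]  # coeffs[0] = fitted water scale factor

# Toy usage with synthetic data (purely illustrative):
rng = np.random.default_rng(0)
wavelengths = np.linspace(1400, 2100, 200)                    # nm
water_ref = np.exp(-((wavelengths - 1940) / 60.0) ** 2)       # fake water band shape
solute = 0.05 * np.exp(-((wavelengths - 1700) / 20.0) ** 2)   # small solute peak
sample = 0.9 * water_ref + solute + 0.002 * rng.standard_normal(wavelengths.size)

residual, water_scale = remove_water_baseline(sample, water_ref)
print(f"fitted water scale: {water_scale:.3f}")       # ~0.9 for this toy case
print(f"max residual (solute peak): {residual.max():.3f}")  # ~0.05
```

More sophisticated versions of the same idea exist in the chemometrics literature (e.g. EMSC or derivative preprocessing), but the plain least-squares version above captures the basic logic.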
We need to evaluate this so we know what equipment to buy and what methods to develop, making the best use of our time now while also planning for the tech that will become available.