UV rays are just electromagnetic energy at certain wavelengths: UVB = 280-315 nm, UVA = 315-380 nm.
I used a Shimadzu 3600 UV-Vis-NIR spectrophotometer, with a fully traceable qualification each month and yearly calibration by the vendor. I had it measure the full spectral range it's capable of (185 nm to somewhere above 3,000 nm) through the lens of the specs I was testing.
Slightly dumbed down: it fires a beam of energy, which is split into two streams. Stream one is the reference; the system is normally used to measure liquid sample solutions, so this stream would usually measure the solvent alone, but in this case it was left blank.
Stream two measures the transmission of energy through the sample (which would be sample plus solvent in normal use), in this case the lens of the specs.
This is all done in absolute darkness inside the machine so you don't get any stray light (energy) interference, plus the sample compartment was purged with inert N2 as extra belt and braces. You then subtract the background (stream one) from stream two to obtain the absolute transmission of the lens.
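As a rough illustration of that correction step, here is a minimal sketch in Python. The arrays are made-up stand-ins for the two beam readings, not real instrument data, and the per-wavelength ratio against the reference beam is my simplified rendering of what the instrument does internally:

```python
import numpy as np

# Hypothetical beam readings (arbitrary detector counts), one per nm.
wavelengths = np.arange(280, 781, 1)  # nm: UVB through end of visible
reference = np.full(wavelengths.shape, 1000.0)  # stream one: blank reference beam
# Stream two: a lens that blocks almost everything below 380 nm.
sample = np.where(wavelengths < 380, 2.0, 850.0)

# Correct the sample beam against the reference beam at each wavelength
# to get percent transmission through the lens.
transmission_pct = 100.0 * sample / reference
```

Plotting `transmission_pct` against `wavelengths` gives the kind of graph described below.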
This data is then plotted over the whole wavelength range against transmission to give a graphical representation. From the data I compared the UVB and UVA transmission against the visible-light region to ensure the lens passed the stated spec: 1% and 0.3x respectively.
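That pass/fail comparison can be sketched as below. Note this is my interpretation of the spec as stated, not a quote from any standard: mean UVB transmission no more than 1% of mean visible transmission, and mean UVA no more than 0.3x mean visible. The band edges follow the post (UVB 280-315 nm, UVA 315-380 nm), with visible assumed to be 380-780 nm:

```python
import numpy as np

def passes_spec(wavelengths, transmission_pct):
    """Check UVB and UVA transmission against the visible region.

    Assumed reading of the spec: UVB mean <= 1% of visible mean,
    UVA mean <= 0.3x visible mean.
    """
    wl = np.asarray(wavelengths, dtype=float)
    t = np.asarray(transmission_pct, dtype=float)
    uvb = t[(wl >= 280) & (wl < 315)].mean()   # UVB band
    uva = t[(wl >= 315) & (wl < 380)].mean()   # UVA band
    vis = t[(wl >= 380) & (wl <= 780)].mean()  # visible band (assumed)
    return bool(uvb <= 0.01 * vis and uva <= 0.3 * vis)
```

For example, a lens transmitting 0.2% across the UV bands and 85% in the visible would pass; one leaking 30% in the UV would fail.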
I did worry about the angle of incidence (we normally test solutions in cells set up perpendicular to the source), but even after playing around with the lens position the result remained the same, giving me confidence the results obtained were genuine.
I hope that makes sense?