Okay, assuming you're using something like the CO2 laser at the top of this page, you're looking at a 1.5 mm diameter spot with a divergence of ~1 milliradian.
Taking the numbers you've given me for the working range, and assuming 100 mm is the minimum distance, divergence adds ~0.2 mm to the rated spot diameter under vertical illumination (laser perpendicular to the surface). This goes up to ~1 mm at 500 mm, again for vertical illumination.
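Those growth figures imply the ~1 mrad is being treated as a half-angle divergence (diameter grows by twice the distance times the divergence). A minimal sketch under that assumption, with names of my own choosing:

```python
import math

def spot_diameter(d0_mm, half_angle_rad, distance_mm):
    """Beam diameter after propagating distance_mm, assuming a simple
    linearly diverging beam with the given half-angle divergence."""
    return d0_mm + 2 * distance_mm * math.tan(half_angle_rad)

d0 = 1.5       # rated spot diameter, mm
theta = 1e-3   # ~1 mrad divergence, treated as a half-angle

for L in (100, 500):
    d = spot_diameter(d0, theta, L)
    print(f"{L} mm: {d:.2f} mm diameter (+{d - d0:.2f} mm)")
```

This reproduces the ~0.2 mm growth at 100 mm and ~1 mm growth at 500 mm quoted above.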
If we're assuming that the 10 cm is the height above the surface, and the 50 cm is the maximum possible hypotenuse distance (so the laser will be scanned up to a maximum angle of 1.37 rad, or ~78.5° from the vertical), that's a little trickier to calculate. My original attempt at a calculation for this gave weird numbers, so we'll try something else:
Simplifying it to a 45° (π/4 radian) angle, you're going to get something like the outer radius of the angled spot being sqrt(2) times the radius of the flat one. It'll be less for the inner radius, but we'll assume they're equal as the worst-case scenario. So, a 1.5 mm diameter spot will swell to ~2.1 mm even if there is no beam divergence whatsoever. With the listed divergence, we're looking at a spot size as large as 3.5 mm in diameter, basically doubling the spot size compared to the vertical case.
As the angle increases, so too will your spot size. While divergence will have a larger effect in absolute terms with increasing angle, I *think* the fact that you're taking an angled slice will always have a greater influence, *especially* at higher angles.
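To see how the two contributions scale, you can sweep the angle while holding the height fixed, so the path length grows as height/cos(angle) (a sketch under the same half-angle-divergence assumption; the specific angles are just illustrative):

```python
import math

d0, theta, height = 1.5, 1e-3, 100.0  # mm, rad (half-angle), mm above surface

for deg in (0, 15, 30, 45, 60, 75):
    a = math.radians(deg)
    path = height / math.cos(a)                  # hypotenuse distance
    diverged = d0 + 2 * path * math.tan(theta)   # growth from divergence
    spot = diverged / math.cos(a)                # oblique-slice stretch
    print(f"{deg:2d} deg: path {path:5.1f} mm, "
          f"divergence adds {diverged - d0:.2f} mm, spot {spot:.2f} mm")
```

The divergence term adds fractions of a millimetre even at steep angles, while the 1/cos slice factor blows the spot up much faster, which is the point being made above.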
So, in short, your beam divergence (which is what collimation is all about) shouldn't be a problem here. The big problem is gonna lie in the simple fact that you're taking conic slices (cylindrical slices in the ideally collimated case) at increasingly sharp angles. If you can limit yourself to smaller angles, though, this might work.