Answer:
This is a standard two-peg test: the collimation error is found by comparing the apparent difference in level between points A and B obtained from two instrument positions. The given observations are:

Level midway between points A and B:
- Reading at A = 1.365 m
- Reading at B = 1.118 m

Level near one end (A) of the line:
- Reading at A = 1.572 m
- Reading at B = 1.317 m
With the instrument midway between A and B, the sight lengths are equal, so any collimation error affects both readings by the same amount and cancels. This setup therefore gives the true difference in level:

True difference = Reading at A - Reading at B
= 1.365 m - 1.118 m
= 0.247 m

With the instrument near A, the sight to A is very short and essentially free of error, while the sight to B carries the full collimation error over the length AB. This setup gives the apparent difference in level:

Apparent difference = Reading at A - Reading at B
= 1.572 m - 1.317 m
= 0.255 m

The collimation error over the length AB is the discrepancy between the two results:

Collimation error = Apparent difference - True difference
= 0.255 m - 0.247 m
= 0.008 m

Because the apparent difference is larger than the true difference, the reading on B was too small, which means the line of sight is inclined downward by 0.008 m in the length AB.
Since none of the listed options matches the calculated collimation error of 0.008 m over the length AB, the correct answer is option a) None of the given answers. It is important to note that collimation error measures how far the line of sight of a level deviates from the truly horizontal; it is obtained from the discrepancy between the apparent and true differences in level, not from a single pair of staff readings.
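As a quick check of the arithmetic, here is a minimal Python sketch of the two-peg test computation. The function and variable names are illustrative only, not from any surveying library:

```python
def two_peg_test(mid_a, mid_b, end_a, end_b):
    """Two-peg test: return (true_diff, apparent_diff, collimation_error).

    mid_a, mid_b: staff readings at A and B with the level midway
                  (equal sight lengths, so collimation error cancels).
    end_a, end_b: staff readings at A and B with the level near A
                  (the long sight to B carries the full error).
    The error is expressed over the whole length AB.
    """
    true_diff = mid_a - mid_b          # error-free difference in level
    apparent_diff = end_a - end_b      # includes collimation error on the sight to B
    error = apparent_diff - true_diff  # positive => line of sight inclined downward here
    return true_diff, apparent_diff, error

true_diff, apparent_diff, error = two_peg_test(1.365, 1.118, 1.572, 1.317)
print(f"True difference:     {true_diff:.3f} m")      # 0.247 m
print(f"Apparent difference: {apparent_diff:.3f} m")  # 0.255 m
print(f"Collimation error:   {error:.3f} m over AB")  # 0.008 m
```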