for processing the images of successive points
or places on the earth’s surface, to depict at a
desired scale the portion of the earth covered.
Progress is also being made toward reduc-
ing if not eliminating the human operator in
the stereoplotting operation.
The development of the Orthophotoscope
by Mr. Bean of the U. S. Geological Survey
has provided the necessary step for portray-
ing the minute details of sensed information
graphically in their correct relative positions.
Means of automatically discriminating be-
tween desirable and undesirable minute de-
tails of sensed information have not been
demonstrated satisfactorily yet, but time and
effort are being expended toward this objec-
tive.
Progress has been and will continue to be
made in automatically, or semi-automati-
cally, converting collected data of a portion of
the earth’s surface back into the three dimen-
sional form at the scales desired.
Transmission and representation of data
between remote points have met with some
success and are gradually improving.
In many of these steps of the mapping proc-
ess just enumerated, the electronic computer
has played or will play an important part.
In analyzing these steps on a comparative
basis as to readiness for utilization, either
successively as automatic or semi-automatic
operations in a mapping process, or as in-
tegral parts of one overall system, the status
seems to be as follows:
For fixing the vehicle, or more precisely the sensor, in
space, the distance measurements, attitude determination,
and environmental effects can all be recorded and fed into
a computer along with geodetic data on points of origin. As
mentioned before, the Hiran System is at the
point where practical application on a large
scale to iron out procedures is about all that
remains. Also for operations covering small
areas at lower altitudes the recently devel-
oped airborne Tellurometer, with its ability
to record distances simultaneously and re-
peatedly from three ground stations, offers
great possibilities for fixing the sensor in
space at low cost. Unmanned air vehicles can
also be considered for this application in the
future. This step, or component of a mapping
system, therefore appears to be at the stage
where the degree of automatization depends
on the requirements of other components, the
coordination with them, and the practical
economy of operation.
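In present-day terms, the position fix just described amounts to a trilateration from measured distances. The short sketch below (Python) solves for the sensor position from three simultaneously recorded ranges to ground stations of known coordinates, in the spirit of the airborne Tellurometer arrangement; the station coordinates, ranges, and initial guess are illustrative assumptions, not recorded values.

    import numpy as np

    # Hypothetical ground-station coordinates (x, y, z in metres) and the three
    # simultaneously recorded slant ranges; all values are illustrative only.
    stations = np.array([[0.0,     0.0,     120.0],
                         [40000.0, 0.0,      95.0],
                         [20000.0, 35000.0, 210.0]])
    ranges = np.array([19424.1, 27882.6, 23702.0])

    def fix_position(stations, ranges, guess=(14000.0, 11000.0, 2500.0), iters=10):
        """Adjust an approximate air position until the computed distances to the
        stations agree with the measured ranges (a small Gauss-Newton solution)."""
        p = np.array(guess, dtype=float)
        for _ in range(iters):
            diffs = p - stations                   # vectors from each station to the sensor
            dists = np.linalg.norm(diffs, axis=1)  # distances computed from the trial position
            residuals = ranges - dists             # measured minus computed
            jacobian = diffs / dists[:, None]      # partial derivatives of distance w.r.t. position
            correction, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
            p += correction
        return p

    print(fix_position(stations, ranges))          # converges near (15 000, 12 000, 3 000)

With only three ranges the fix is essentially determinate; additional stations or repeated readings would simply be carried through the same adjustment.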
With regard to sensors, the photographic
processes are now adequate and certain of the
electronic types of sensors show promise. The
situation appears to be that adequate results
can be obtained by a photographic sensor
with the degree of automatization determined
by the other components, provided the
weather is favorable. A greater degree of
automatization, and a lesser dependence on
weather conditions, can be obtained from cer-
tain electronic sensors, provided a compro-
mise is made as to quality and minuteness
of detail. Considerable time and effort can be
expected to be expended before electronic
sensors produce results acceptable to users
who are accustomed to photographic prod-
ucts.
In the so-called map compiling step a num-
ber of advancements have taken place. With
the development of electronic computers the
more or less dormant practice of analytical
photogrammetry took on new life. Stereo-
comparators have been redesigned and at
least partially automatized. Direct or in-
direct linkage with computers is under experi-
mentation for performing analytical triangu-
lation. The Helava development illustrates
the possible departure from optical-me-
chanical to optical-electronic stereoplotting.
Then we have quite a bit of activity in the
automatic stereoplotting field. Since this ob-
jective has been of interest to many and the
subject of investigation and experimentation
in several places I will discuss that of which I
have knowledge with no assurance as to com-
pleteness or chronological order.
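As a point of reference for the analytical triangulation mentioned above, the computation rests on the collinearity condition: the exposure station, an image point, and its ground point lie on one straight line. The sketch below evaluates that condition for a single point; the rotation convention, focal length, and coordinates are assumed for illustration and are not taken from any particular instrument.

    import numpy as np

    def rotation(omega, phi, kappa):
        """Orthogonal matrix from the three attitude angles, in radians
        (one common omega-phi-kappa convention; sign conventions vary)."""
        co, so = np.cos(omega), np.sin(omega)
        cp, sp = np.cos(phi),   np.sin(phi)
        ck, sk = np.cos(kappa), np.sin(kappa)
        r_omega = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
        r_phi   = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        r_kappa = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
        return r_kappa @ r_phi @ r_omega

    def collinearity(ground_pt, station, angles, focal_length):
        """Image coordinates (x, y) of a ground point from the collinearity condition."""
        m = rotation(*angles)
        u, v, w = m @ (np.asarray(ground_pt) - np.asarray(station))
        return -focal_length * u / w, -focal_length * v / w

    # A vertical photograph 3 000 m above ground with a 152 mm camera:
    # a point 500 m east of the nadir images about 25 mm from the photo centre.
    print(collinearity(ground_pt=(500.0, 0.0, 100.0),
                       station=(0.0, 0.0, 3100.0),
                       angles=(0.0, 0.0, 0.0),
                       focal_length=0.152))

Analytical triangulation amounts to writing two such equations for every measured image point and adjusting the unknown orientations and ground coordinates until all of them are satisfied together.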
If we wish to perform this operation auto-
matically we must determine a way of uti-
lizing electronics, mechanical means or a com-
bination of both. We must determine the z co-
ordinate for each xy position in the model. If
we can determine the point of intersection of
each pair of conjugate rays in the model
electrically, while we mechanically scan the
model, we will have accomplished our pur-
pose. There are several approaches, both in
the method of scanning the model and the
method of determining the point of conjugate
ray intersection, or to use a more familiar
term, the point of image match. Let us first
consider scanning the model. The scan can be
made to traverse throughout the entire area
occupied by the model determining the points
of match only, or a match decision can be
made as each point is encountered, and the
data used to direct the scan generally towards
the next most probable match point, i.e. along
the surface of the model.
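To make these two scanning notions concrete, the match decision at a given x, y can be pictured as trying a series of elevations, shifting the conjugate imagery by the parallax each elevation implies, and keeping the elevation at which the two patches agree best. The sketch below uses a simple normalized correlation as the measure of agreement; the patch size, parallax scale, and synthetic imagery are assumptions chosen only to show the idea.

    import numpy as np

    def patch_correlation(a, b):
        """Normalized correlation of two image patches: near 1.0 when they match."""
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum() + 1e-12))

    def best_match_z(left, right, row, col, trial_zs, half=5):
        """Try each elevation, shift the right-hand patch by the parallax that
        elevation implies, and keep the elevation whose patches agree best."""
        parallax_per_metre = 0.2                   # assumed model geometry: pixels of parallax per metre
        left_patch = left[row - half:row + half + 1, col - half:col + half + 1]
        scores = []
        for z in trial_zs:
            shift = int(round(z * parallax_per_metre))
            right_patch = right[row - half:row + half + 1,
                                col - half + shift:col + half + 1 + shift]
            scores.append(patch_correlation(left_patch, right_patch))
        return trial_zs[int(np.argmax(scores))]

    # Synthetic test: a flat "model" whose true elevation of 40 m produces an
    # 8-pixel parallax between the two images.
    rng = np.random.default_rng(0)
    left = rng.random((64, 128))
    right = np.roll(left, 8, axis=1)
    print(best_match_z(left, right, row=32, col=40, trial_zs=list(range(0, 80, 5))))   # recovers 40

An exhaustive scan would repeat this search independently at every x, y of the model, while a surface-following scan would start each new search from the elevation just found at the neighboring point, which is the distinction drawn above.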
Now let us consider the determination of
the conjugate ray intersection or image
match. Where the human eye makes the
match optically, we must now determine the