Least Squares: The Great Arbitrator

When it comes to evaluating surveying and geodesy measurements, many people rely on the mathematical sophistication and primacy of least squares network analysis and adjustment to provide the “final answer.” While a few surveyors may be confident in the tightness of the solutions produced by their instrumentation and view this idea of adjusting survey networks as unnecessary, a growing number of surveyors consider it so critical in their work that they apply least squares to every survey.

History

Least squares grew out of the field of astronomy, and it is widely accepted that Carl Gauss first proposed it more than two centuries ago. The term “Gaussian distribution,” which may be familiar to surveyors as the “normal distribution” bell curve of statistical uncertainty fundamental to the evaluation of surveying measurements, owes its name to Gauss. The concept was refined by others and adapted for use beyond astronomy; geodesy and surveying are but two of its many uses.

“Least squares” is a broad term for an array of methods that take the concept of error distribution and apply it across many correlated measurements, even those evaluated in a complex network. It provides the statistically most likely locations of multiple measured points, with the option of varying the “weight” of selected measurements. Basically, it’s a way to find the “best fit” of correlated measurements by minimizing the sum of the squares of their residuals (errors).
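In rough terms, the computation looks like the following minimal Python/NumPy sketch (invented numbers, not any particular package’s algorithm): three redundant observations of the same unknown are combined by minimizing the weighted sum of squared residuals.

```python
# Minimal illustration of weighted least squares (values are made up).
# Three leveling runs give the height of an unknown point P from three
# benchmarks of known elevation; each run has its own standard error.
import numpy as np

bench_elev = np.array([100.000, 102.500, 98.750])   # known benchmark elevations (m)
dh_obs     = np.array([ 12.413,   9.918,  13.668])  # observed height differences to P (m)
sigma      = np.array([ 0.003,    0.005,   0.010])  # a priori std. errors of each run (m)

# Each observation predicts P's elevation directly: l_i = bench_i + dh_i
l = bench_elev + dh_obs

# Design matrix for the single unknown (elevation of P) and weight matrix
A = np.ones((3, 1))
W = np.diag(1.0 / sigma**2)          # weights = inverse variances

# Normal equations: x = (A^T W A)^-1 A^T W l  -- minimizes sum(w_i * v_i^2)
N = A.T @ W @ A
x = np.linalg.solve(N, A.T @ W @ l)
v = l - A @ x                         # residuals

print(f"Adjusted elevation of P: {x[0]:.4f} m")
print("Residuals (m):", np.round(v, 4))
print(f"A priori std. error of result: {np.sqrt(np.linalg.inv(N)[0, 0]):.4f} m")
```

The same normal-equation machinery scales up to thousands of correlated observations and unknowns, which is where the “no mortal could do it by hand” characterization comes from.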

That can represent a tremendous amount of computation, or, as some have characterized least squares, “mathematics that no mortal could do by hand.” Indeed, it was not until powerful personal computing became inexpensive and readily available that least squares landed in the toolbox of many surveyors. 

Legacy methods for evaluating and adjusting closed-loop or otherwise constrained traverses were practical to compute by hand, and they had been for centuries. However, when cross ties were added to traverses and post-processed GPS network campaigns of perhaps thousands of observations were considered, the legacy methods showed their weaknesses. The historic confluence of many technologies in the 1980s and 90s—digital data collection, inexpensive, powerful processors, GPS, and software development—created ideal conditions for the rise of least squares.

Use and Misuse

Of course, such a powerful tool can be misused. Some people greet new technologies with skepticism, especially those that can lead to “button-pushing” practices. Least squares software packages are not immune to this. Proper training and adoption of best practices are essential, especially as overconfidence in the increasing tightness of instrumentation can lead surveyors to, for example, fly-tie rather than close traverses. Skipping the evaluation of residuals before adjustment and simply hitting the adjust button until the results look good is just one way such tools can be abused. Proper weighting can be highly subjective and widely misunderstood; manipulating weights to achieve that perfect chi-square test result can be quite tempting.

When used properly, least squares gives surveyors solid assurance that the results it yields best reflect the true quality of a survey’s measurements. According to Curt Sumner, LS, executive director of the National Society of Professional Surveyors, “It allows one to assign the appropriate weights to measurements based on the ‘quality’ of a given measurement.

“The compass rule treats every measurement as equally precise, whereas least squares can account, for example, for the measurement that was made with a 30-second transit versus a 1-second total station, or the measurement made to a prism on a tripod versus a prism on top of 12 feet of range pole, by giving more weight to the higher-quality measurement. Thus, the results of the least squares adjustment should better reflect the true quality of the survey measurements.”
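A back-of-the-envelope sketch of Sumner’s point, with invented numbers: when weights are taken as inverse variances, the 1-second instrument dominates the combined result, as it should.

```python
# Hedged sketch of the weighting idea: the same angle measured with a
# 30-second transit and with a 1-second total station (illustrative values).
import numpy as np

obs_sec   = np.array([45 * 3600 + 20.0, 45 * 3600 + 12.0])  # observed angle, in arc-seconds
sigma_sec = np.array([30.0, 1.0])                            # instrument std. errors (arc-seconds)

w = 1.0 / sigma_sec**2                  # weight = 1 / variance
weighted = np.sum(w * obs_sec) / np.sum(w)

# The adjusted value sits almost on top of the 1-second observation,
# because that instrument carries roughly 900 times the weight of the transit.
print(f"Weighted angle: 45° {weighted - 45*3600:.2f}\"")
print("Relative weights:", np.round(w / w.max(), 5))
```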

STAR*NET

Born from the era of early adoption of least squares by surveyors and geodesists in the 1980s, several successful stand-alone least squares software packages are now available, and some have been integrated into popular survey and civil software suites. STAR*NET holds particular distinction as one of the earliest commercial solutions of its kind and as one of the most recognized, lauded, and popular. It is not uncommon to hear the phrase “run it through STAR*NET” when someone wants a “final” result.

STAR*NET has recently been acquired by MicroSurvey (itself now under the Hexagon umbrella along with Leica Geosystems), and further development and upgrades are expected to continue. Software packages like these allow the surveyor to introduce least squares network adjustments and corrections seamlessly, without upsetting workflow. Even more important, they help ensure that a survey’s measurements are true and reliable before they are delivered to a client.

These packages have evolved into fully menu-driven tools with sophisticated user interfaces that allow users to edit input data, run adjustments, and view results, all from within the software. This is possible through multiple graphical windows that can be viewed simultaneously. Developers have consistently improved the software over the years to include a wide range of functionality that surveyors have requested. What’s more, such programs may be offered as standalone products that can easily fit between the surveyor’s field data collection and desktop mapping steps. Again, this has proven essential for optimal, uninterrupted workflow. Least squares software packages work well for nearly all kinds of surveys and prove ideal for the adjustment of traditional interconnected traverse networks.  

Least Squares in the Field

Peter Haas is a Canadian land surveyor with the Focus Corporation in Victoria, British Columbia. Focus has approximately 20 office locations in Western Canada and performs many types of survey work, including legal subdivisions, easements and rights-of-way, construction surveying, building layouts, and deformation monitoring, and it uses STAR*NET for much of that work.

Surveyors use all kinds of measurement and data collection tools—total stations, GPS, etc.—as part of their process to check survey results. How reliable those results are depends on the checking method used. “If you make a check and find you have a slight misclosure in the survey,” said Haas, “you can accept it and say it’s good enough. Or you can apply least squares and some redundancy in the measurements and you’ll come up with a better answer.” 

Haas uses Spectra Precision Survey Pro data-collection software for conventional work and Trimble Access for GPS work. One key aspect of STAR*NET that has been a constant in all versions is easy integration of data. Thanks to STAR*NET’s data converters, data can be imported from numerous formats such as TDS, Carlson, TSC (Trimble), and SMI, as well as from a variety of digital levels, including Topcon and Leica.

This proved expedient for Haas when conducting a topographic survey for a power company involving a power line pole placement and design. Haas explains that it was a long linear survey of a stretch of road covering several miles. Part of the survey required using a reflectorless total station, while other parts were wide open and could be surveyed with RTK. In addition, the area contained published monuments that could be checked with GPS and the total station. “I was able to take my data files from the total station plus the vectors from the RTK survey and put them into STAR*NET to make sure all of the data was coming out in a consistent coordinate system,” Haas said.  

One of the more challenging surveying projects Focus took on involved deformation monitoring—the systematic measurement and tracking of the alteration in the shape or dimensions of an object as a result of stresses induced by applied loads. In this instance, Focus was asked to perform deformation monitoring on some buildings and on a barrier wall between the ocean and an upland area. A government entity on a military base in Victoria, B.C., was conducting a required environmental remediation on the site that had previously been used for various industrial purposes. The project necessitated removing all existing soil and backfilling the site with new soil. “But they were excavating below sea level, and so they built a barrier wall,” Haas said. “They wanted us to come in and monitor that wall for movement [caused by pressure of the sea pushing on the wall] to make sure the wall wasn’t going to collapse.”

Focus used the pre-analysis function in STAR*NET that analyzes the geometric strength of the network using the approximate layout and the instrument accuracies. “The pre-analysis function allowed us to test out how many measurements would be needed to get the required accuracy and number of observations for our survey,” Haas said. “This function was very helpful because it didn’t require any field work to get it. We went out and did the field work and started to process the actual field measurements,” Haas continued. “We found the measurements were very close to what the pre-analysis had told us.”
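The details of STAR*NET’s pre-analysis are its own; the general idea, though, is error propagation through the planned geometry using a priori instrument accuracies. The sketch below is a simplified, hypothetical illustration for a single angle-and-distance sight, with assumed accuracy figures, not the software’s actual method.

```python
# Simplified pre-analysis idea (not STAR*NET's implementation): predict the
# positional uncertainty of a point from planned geometry and a priori
# instrument accuracies, before any field work is done.
import math

dist_m       = 250.0                   # planned sight distance to the point (m)
sigma_dist_m = 0.002 + 2e-6 * dist_m   # EDM spec, e.g. 2 mm + 2 ppm (assumed)
sigma_ang    = 2.0                     # angular accuracy in arc-seconds (assumed)
n_rounds     = 4                       # planned sets of observations

sigma_ang_rad = sigma_ang / 206265.0          # arc-seconds -> radians
sigma_along   = sigma_dist_m                  # uncertainty along the line of sight
sigma_across  = dist_m * sigma_ang_rad        # uncertainty across the line of sight

# Averaging n independent rounds shrinks both components by sqrt(n)
sigma_along  /= math.sqrt(n_rounds)
sigma_across /= math.sqrt(n_rounds)

print(f"Predicted along-sight std. error : {sigma_along*1000:.1f} mm")
print(f"Predicted across-sight std. error: {sigma_across*1000:.1f} mm")
# If these fall outside the required accuracy, add rounds, shorten sights,
# or add redundant ties -- all before committing to field work.
```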

In deformation monitoring, surveyors take baseline measurements before any additional work is done, then return once a week or month to take another round of observations and compare these to the original measurements to check for any change in values. Using STAR*NET each time enabled Focus to get the best answers for its monitoring points and compare the coordinates to see if there had been any movement. “This all revealed there wasn’t much movement happening,” Haas noted.
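In general terms, and independent of any particular software, an epoch-to-epoch comparison of this kind might look like the sketch below, with invented coordinates and uncertainties: a displacement is flagged only when it exceeds what the combined uncertainty of the two adjustments can explain.

```python
# Illustrative epoch comparison for deformation monitoring (made-up
# coordinates and uncertainties, not project data).
import math

# (north, east) in metres and horizontal std. error per epoch, per point
baseline = {"W1": ((5000.0000, 2000.0000), 0.002),
            "W2": ((5050.0000, 2000.0000), 0.002)}
epoch2   = {"W1": ((5000.0012, 1999.9991), 0.002),
            "W2": ((5050.0005, 2000.0108), 0.002)}

for pt, ((n0, e0), s0) in baseline.items():
    (n1, e1), s1 = epoch2[pt]
    move  = math.hypot(n1 - n0, e1 - e0)   # horizontal displacement
    sigma = math.hypot(s0, s1)             # combined std. error of the two epochs
    flag  = "MOVEMENT?" if move > 1.96 * sigma else "stable"
    print(f"{pt}: displacement {move*1000:5.1f} mm  "
          f"(95% limit {1.96*sigma*1000:.1f} mm)  {flag}")
```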

Least Squares in the Classroom

Martin Paquette, a professional land surveyor and an instructor at Renton Technical College in Renton, Washington, has practiced least squares network adjustment and used STAR*NET to do it for the last 20 years. Paquette finds several features compelling. One he cited is batch processing of raw data in text format (line by line). “It’s easy to read and spot errors and to archive,” Paquette noted. “You archive the raw data that is an extension of the field notes, really, and annotate it to explain what’s going on.”

Another feature Paquette likes in STAR*NET is that it provides a detailed report as a result of a survey adjustment. “It explains the adjustment, all the residuals (the apparent errors), and allows you to re-run the adjustment easily,” Paquette said. “That’s the nice thing about the software. You see an error and then correct it, then run the program again and it comes back to tell you your report looks fine [or not].”

Paquette noted that his commitment to the least squares concept as a professional surveyor carries over into the surveying classes he teaches, which is why every surveying student learns least squares adjustment before graduating from Renton Technical College. “If we can get students out the door who’ve actually been through the difficult part of the learning curve [with least squares], they can go into a job and immediately be familiar with it.”  Paquette often finds, when talking to graduates of his surveying class who now work at surveying firms, that in many instances those graduates are the first people at their firms to initiate the practice of least squares adjustment.

In reflecting on recent developments like STAR*NET’s Version 7, Paquette finds features like the differential leveling loop closure check integral to surveying projects, noting that it is a distinctive tool not offered in most least squares software. “Here on campus, in particular, we have been able to tighten our elevations of all of our control points using least squares,” Paquette said. “We have better closures with it. Otherwise, without this capability, the errors tend to accumulate in unnatural places.”
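The arithmetic behind a leveling loop closure check is straightforward. The sketch below uses invented readings and an assumed allowable-misclosure constant, not Renton’s data or STAR*NET’s internal method: the observed rises and falls around a closed loop should sum to zero, and the misclosure is compared to a tolerance that grows with the square root of the loop length.

```python
# Sketch of a differential-leveling loop closure check (example values only).
import math

height_diffs_m = [1.245, -0.318, 2.071, -2.994]  # observed height differences around the loop (m)
loop_length_km = 3.2                              # total length levelled (km)
k_mm = 12.0                                       # allowable mm per sqrt(km), class-dependent (assumed)

misclosure_mm = sum(height_diffs_m) * 1000.0
allowable_mm  = k_mm * math.sqrt(loop_length_km)

status = "within tolerance" if abs(misclosure_mm) <= allowable_mm else "REJECT - re-level"
print(f"Loop misclosure: {misclosure_mm:+.1f} mm (allowable ±{allowable_mm:.1f} mm) -> {status}")
```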

Output results are clear and reliable. Paquette specifically calls out the software’s error ellipse function. “The two axes of the ellipse represent minimum and maximum components of the predicted errors at any given point and scale with the standard errors of the field techniques used,” Paquette said. “This represents the region of confidence—usually at the 95% level—that the position measured falls within that ellipse as part of the software’s network plotting visualization tool set.”
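The geometry Paquette describes comes from each adjusted point’s covariance matrix. A generic construction is sketched below, with invented covariance values and no claim that this is STAR*NET’s internal computation: the eigenvalues give the semi-minor and semi-major axes, and a chi-square scale factor converts the 1-sigma ellipse to a 95% confidence region.

```python
# Error ellipse from a point's 2x2 covariance matrix (illustrative values).
import numpy as np

cov = np.array([[4.0e-6, 1.5e-6],      # variances/covariance of (N, E) in m^2
                [1.5e-6, 2.5e-6]])

eigvals, eigvecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
semi_minor, semi_major = np.sqrt(eigvals)         # 1-sigma axes (m)
theta = np.degrees(np.arctan2(eigvecs[1, 1], eigvecs[0, 1]))  # orientation of major axis

k95 = np.sqrt(5.991)   # chi-square with 2 degrees of freedom, 95% confidence
print(f"95% error ellipse: semi-major {semi_major*k95*1000:.1f} mm, "
      f"semi-minor {semi_minor*k95*1000:.1f} mm, orientation {theta:.1f}°")
```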

The value of using least squares network adjustment cannot be overstated. After all, as surveyor and instructor Paquette points out, “The real key in least squares is that every measurement has a standard error even when figures close perfectly.” This can be frustrating, certainly, but it’s valuable to know so that errors can be corrected. Haas, of Focus Corporation, agrees. “Least squares takes all of your measurements and works out the statistics on them so you can identify blunders—large errors,” he said. “It gives you confidence in the values.”

If you are not using least squares for your surveys, here are some reasons to consider adopting it for your surveying work:

  • It is a mathematically correct method for checking the results of surveys of any size, especially traverses.
  • It is ideal for combining conventional, GPS, and leveling data into a control network and checking the validity of results.
  • It enables a surveyor to assign the appropriate weights to measurements based on the quality of a given measurement.
  • It allows flexibility during field data collection: field data can be collected in any order and configuration and from a wide range of formats and measurements—including total station measurements, GPS vectors/leveling data, etc.—even older measurements.
  • It helps identify blunders and errors.

Least squares network adjustment provides a final corrected network, complete with error ellipses.
In Paquette’s opinion, having least squares adjustment available and using it encourages surveyors to verify their network results correctly. “Least squares helps you catch the less obvious errors and also helps you catch the stuff that is outside of tolerance that you might have thought was okay,” Paquette said. “That’s where the real key [with using least squares] is. You can determine if the limits for these results are acceptable or not.”
