Browsing by Author "Johnson, Steven D."
- Analysis of technological change and relief representation in U.S.G.S. topographic maps. Mahoney, Patricia (Virginia Tech, 1991-01-15). In 1882, the United States Geological Survey began its National Mapping Program, designed to map the nation using a series of several thousand topographic quadrangles. Since that date, the program and the maps themselves have undergone many changes, due mainly to technological advances in mapping methods. The use of data collected from historic U.S.G.S. topographic maps in modern-day applications requires a general knowledge of the potentials and limitations of these data. This study compares representations of terrain features on historic maps compiled using plane table methods with the same features as represented on more accurate modern maps compiled using photogrammetry. Using the modern map as a standard, errors in the old maps were identified and defined using statistical procedures. Measures of closed contour lines recorded the angularity of the line, the length of the line, the area within the contour, the shape of the feature, and the spatial relationships between contour pairs. The analysis attempts to relate errors to these geometric components of contour lines and to predict the occurrence of error. Because of the smoothing and generalization of contour lines in plane table surveys, measures of both angularity and shape were significantly different between older and newer maps. Systematic errors, a consistent displacement of contour lines in a similar direction, were also identified on the historic maps. Based on these results, several suggestions for continuing the research are given.
- The Automated Laser Position System - ALPS. Lundberg, Eric J. (Virginia Tech, 1989-09-15). The construction industry needs an accurate real-time positioning system. Such a system, if successfully implemented, would lead to significant increases in the performance of many construction operations. This thesis presents the Automated Laser Position System (ALPS) for accurate real-time positioning. ALPS is a spin-off of the Automated Position And Control System (APAC) research, sponsored by the National Science Foundation under grant DMC-8717476. The ALPS concept has three primary components: a rotating laser, laser detectors, and a central processing unit. ALPS generates both horizontal (X, Y) and vertical (Z) position information. It is mathematically predicted that ALPS could produce accuracies of ±17 mm in the horizontal and ±5.9 mm in the vertical at a range of 400 m, with position measurements updated 50 times a second. (An illustrative sketch of a two-transmitter horizontal fix follows this listing.)
- Computer software to calculate the systematic coordinate differences between two geodetic datums. Besecky, Edward Joseph (Virginia Tech, 1990-12-03). The high degree of accuracy now attainable with GPS observation techniques has led to worldwide acceptance of geocentric datums, specifically the WGS84 datum, as the mainstay for referencing in the geodetic community. Local datums, however, are non-geocentric, so positions determined with GPS will disagree with positions referenced to those datums. This report presents PC-based software to transform coordinates between any two arbitrary datums. Transformations between NAD27 and NAD83 are used as examples, supplemented by maps that illustrate the shifts between those two datums in latitude, longitude, and geoidal height. It should be stressed that these transformations are based upon the standard seven parameters (three shifts, three rotations, and a scale change) and upon changes in the semimajor axis and the flattening, including second partial differentials. The software does not take into account any random distortions that may be present in the datum coordinates. (An illustrative sketch of the seven-parameter transformation follows this listing.)
- Considerations for implementing a microcomputer database for Virginia control survey data. Goldsmith, Ted G. (Virginia Tech, 1987-09-21). Currently, geodetic control data are generally available only at the federal level. The National Geodetic Survey (NGS) publishes only the results of those surveys that the agency performs, as well as other surveys that meet certain criteria. The advantages of a state office that would disseminate NGS control data, as well as all other survey data within the state, are discussed. Since NGS now distributes its data on floppy diskettes, the design of a microcomputer database to access this information is investigated. This thesis focuses on such a database system operated mainly by the end user of control data. An integrated software system, in which related computational programs are linked to the database, is also considered. The dBase III package is examined as one software alternative. Applicability to geographic information systems is also explored. Data formats, file sizes, and administrative concerns are addressed, and future research and development in these and related areas are proposed.
- Determination of the end of functional service life for concrete bridge components. Fitch, Michael G. (Virginia Tech, 1993-04-26). The transportation engineering community of the United States faces a tremendous problem: the gradual deterioration of the nation's bridges. A major component of the overall bridge deterioration problem is the corrosion-induced deterioration of reinforced concrete bridge components that are exposed to de-icing salts. The progression of events resulting from corrosion of the reinforcing steel includes cracking, delamination, spalling, and patching of the surface concrete. Bridge components reach the end of their functional service life when the level of damage warrants rehabilitation. The objective of this study was to determine the end of functional service life for concrete bridge decks, piers, and abutments by quantifying terminal levels of physical damage. The approach for quantifying terminal damage levels involved obtaining recommendations from state Department of Transportation (DOT) bridge engineers via an opinion survey. A field study of 18 existing concrete bridges that had been designated for rehabilitation was conducted to develop concrete bridge component maps showing areas of physical damage. Deck damage maps were produced using a ground-based photogrammetry system developed in this study, while pier and abutment damage maps were drawn by hand in the field. Survey kits based on the component damage maps were distributed to bridge engineers in 25 states that use de-icing salts. The engineers evaluated the maps and recommended when each component should be, or should have been, rehabilitated. Based on the engineers' responses, linear regression prediction models were developed to relate the recommended bridge component rehabilitation time point to the physical damage level. Based on the prediction models, two viable terminal damage levels for concrete bridge decks, and a partial terminal damage level for concrete bridge piers, were quantified. (An illustrative regression sketch follows this listing.)
- The effect of thresholding a maximum likelihood classifier on the accuracy of a Landsat classification of a forested wetland. Agnello, Jennie M. (Virginia Tech, 1987-08-05). Although the maximum likelihood classifier is a popular classification technique, there is an inherent problem associated with the 100% classification of a scene: there will inevitably be pixels within a study area that have a low probability of belonging to any of the predefined categories. The focus of this research was to locate these low-probability pixels and observe their effect on classification accuracy. This was done by performing supervised classifications at various threshold levels using two methods of classifier training: combined category training site statistics and separated category training site statistics. In general, it was found that a majority of the scene was classified at very low probabilities, but the accuracy of the resulting classifications was much greater than the low probabilities would suggest. (An illustrative sketch of thresholded maximum likelihood classification follows this listing.)
- Effects of grid lattice geometry on digital image filtering. Brown, Roger Owen (Virginia Tech, 1989-04-05). The spatial distribution of discrete sample points from an image affects digital image manipulation. The geometries of the grid lattice and edge are described for digital images, and edge-detecting digital filters are considered for segmenting an image. A comparison is developed between digital filters for two different digital image grid lattice geometries: the 8-neighbor grid lattice (rectangular tessellation) and the 6-neighbor grid lattice (hexagonal tessellation). Digital filters for discrete images are developed that are best approximations to the Laplacian operator applied to continuous two-dimensional mathematical surfaces. Discrepancies between the calculated Laplacian and the digital filtering results are analyzed, and a criterion is developed that compares grid lattice effects. The criterion shows that digital filtering in a 6-neighbor grid lattice is preferable to digital filtering in an 8-neighbor grid lattice. (An illustrative sketch of the two Laplacian stencils follows this listing.)
- An empirical study of the fidelity of organizational accounting communication and the impact of organizational culture. Johnson, Steven D. (Virginia Tech, 1991-06-04). Communication and culture both play essential roles in organizations. The effective communication of accounting information is required to coordinate business operations and move the organization toward the accomplishment of strategic goals. Without effective communication, the most sophisticated analyses and crucial reports will fail to generate appropriate decisions and actions. Culture is a symbolic system of values that helps the members of an organization explain, coordinate, and evaluate behavior and ascribe common meanings to events and symbols encountered in the organization. Organizations confine the experience and interaction of their members into structured and recurring patterns. As organization members interact, shared meanings for issues of common interest evolve, and a technical organizational language develops whose symbols have definite and common meaning. If the cultures of organizations, or of subcultures within an organization, differ, dissimilar meanings could be ascribed to the management accounting terms (symbols) used to communicate accounting information. Dissimilar meanings could inhibit the fidelity of accounting communication within and between organizations and organization subunits.
- Evaluation of photographic properties for area estimation. Wiles, Steven Jay (Virginia Tech, 1988-05-15). From the known image positional errors on aerial photographs, this thesis computes and evaluates acreage estimation errors. Four hypothetical tracts were used in simulating aerial photographs with 104 different camera orientation combinations. Flying heights of 4000 and 6000 feet, focal lengths of 24 and 50 millimeters with and without lens distortion, and tilts of 0, 3, 6, and 12 degrees were simulated. The 416 photographs were all simulated with the camera exposure station centered above the midpoint of the respective tract's bounding rectangle. The topographic relief of the tracts ranged from 19 feet in the Coastal Plain to 105 feet in the Piedmont. It was found that lens focal length did not have an independent effect on the acreage estimates. Relief error was the smallest, averaging -0.080%; in comparison, even small errors in calculating scale were shown to be larger than relief errors. It was recommended that tilt be limited to six degrees; error at six degrees of tilt averaged +1.6%. Because of its positive exponential nature when the tracts are centered, tilt can induce large biases; including tilts from zero to six degrees, the average error was 0.634%. Lens distortion error averaged -0.686%. Overall, the average acreage error was 0.363% for simulations up to and including six degrees of tilt, with and without lens distortion. This result is for centered tracts, and it was felt that many of the errors were compensating in this situation. In conclusion, areas can be estimated from the photographic images to within about 1%; however, additional errors are imparted during actual measurement of the photographs. (An illustrative relief displacement calculation follows this listing.)
- A geometric approach to determination of satellite ephemeris over a limited area. Thackrey, Keith R. (Virginia Tech, 1988-12-15). Range and interferometric observations have been examined for their potential application in a geometric approach to the determination of satellite ephemerides. The approach differs from the normal (dynamic) approach in that each satellite position is treated as an independent state variable or benchmark. Programs have been developed that simulate and format the input data for the least squares estimation routines and perform statistical analyses of the results. Random errors, tropospheric refraction errors, and atomic clock errors have been considered, and the range observation adjustment program was directed to solve for clock errors.
- A new perspective for creating geographic products for drug interdiction. Capps, Penny R. (Virginia Tech, 1992-12-17). According to former Federal Bureau of Investigation Director William Webster, "The only way any coordinated efforts can succeed in combating drug trafficking, organized crime, terrorism, or even bank robberies is through the timely and candid exchange of intelligence data on criminal activities." This paper proposes an approach to creating geographic products that could be used in fighting the war on illegal drug trafficking. Products such as likely trafficking routes, border crossing observation points, and potential processing plant locations represent combinations of geographic and intelligence data which would be assets in drug interdiction. The approach described in this paper is significant in that it addresses the four primary phases of drug interdiction: cultivation, processing, smuggling, and distribution. The digital data sets and computer technologies required to create the products are discussed, including topics such as fusion by common coordinates of imagery and feature data, and dynamic segmentation using textual and feature information. This approach toward managing intelligence and environmental data and creating products for drug interdiction will help maximize the effectiveness of law enforcement assets.
- The Sequential Givens method for adjustment computations in photogrammetry. Johnson, Theodore David (Virginia Tech, 1988-05-05). The Givens orthogonalization algorithm is an efficient alternative to the normal equations method for solving many adjustment problems in photogrammetry. The Givens method is one of a class of methods for solving linear systems known generally as orthogonalization or QR methods. It allows for sequential processing and greatly simplifies the computation of statistics on the observations and residuals. The underlying reason for these advantages is the immediate availability of the orthogonal Q matrix, which is computed as the data are processed and is intimately related to the statistics needed for blunder detection. One of these statistics, the F statistic computed from externally studentized residuals, is both easily obtained and well suited for blunder detection. The Givens method requires nearly four times as many computations as the normal equations approach to reach a solution. However, depending on the size of the problem, blunder detection through the normal equations requires far more computer time than is required when starting with a Givens decomposition. The method allows a user to review intermediate results, test residuals, and modify the solution without having to compute a full solution. Adjustments of a level net and a single-photo resection are used to demonstrate the method. Because of the advantage in computational time, the Givens method is superior to the normal equations approach when rigorous blunder detection is required. (An illustrative sketch of the sequential Givens update follows this listing.)
- Study of photogrammetric self-calibration adjustment method. Long, Barrington (Virginia Tech, 1990-05-05). The development of a viable self-calibration approach for use with non-metric cameras was investigated. Both computer-generated and actual test camera data were used to determine the effectiveness of the math model and computer program. A twenty-seven parameter bundle adjustment routine was proposed because of its versatility and its compatibility with an existing aerotriangulation package. For the camera and test configuration considered, the focal length was recovered to within two percent, and the principal point location was recovered to within 0.3 to twelve percent. When the computer-generated data were used, the focal length and principal point offset were recovered to within 0.2 percent. Modeling and software have been made available for a future comparative study between the self-calibration and Direct Linear Transformation adjustment parameters. The self-calibration and Direct Linear Transformation modeling software is a promising tool for mensuration tests and experiments with video and Charge Coupled Device (CCD) imagery.
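
The ALPS entry above predicts horizontal and vertical accuracies but, being an abstract, gives no geometry. A minimal sketch of one plausible way a horizontal (X, Y) fix could be computed, assuming azimuth angles measured from two rotating-laser transmitters at known locations (the transmitter coordinates and angles below are hypothetical, not values from the thesis):

```python
import math

def intersect_azimuths(p1, az1, p2, az2):
    """Horizontal (X, Y) fix from azimuth angles observed at two known
    transmitter locations; angles in radians, measured clockwise from north."""
    x1, y1 = p1
    x2, y2 = p2
    d1 = (math.sin(az1), math.cos(az1))      # unit direction of ray 1
    d2 = (math.sin(az2), math.cos(az2))      # unit direction of ray 2
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    # Solve p1 + t*d1 = p2 + s*d2 for t (Cramer's rule on a 2x2 system).
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])

# Hypothetical transmitters 400 m apart and example azimuths to one detector.
print(intersect_azimuths((0.0, 0.0), math.radians(45.0),
                         (400.0, 0.0), math.radians(-30.0)))
```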
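
The datum-transformation entry above names the standard seven parameters (three shifts, three rotations, and a scale change). A minimal sketch of that seven-parameter (Bursa-Wolf) similarity transformation follows, with placeholder parameter values and test point, and without the semimajor-axis, flattening, or second-partial-differential terms handled by the actual software:

```python
import numpy as np

def helmert_7param(xyz, tx, ty, tz, rx, ry, rz, ds):
    """Seven-parameter similarity transformation between geocentric Cartesian
    frames.  rx, ry, rz are small rotations in radians; ds is the scale change
    (e.g. 0.6e-6 for 0.6 ppm).  Rotation sign conventions vary between agencies."""
    xyz = np.asarray(xyz, dtype=float)
    shift = np.array([tx, ty, tz])
    rot = np.array([[1.0,  rz, -ry],      # small-angle rotation matrix
                    [-rz, 1.0,  rx],
                    [ ry, -rx, 1.0]])
    return shift + (1.0 + ds) * rot @ xyz

# Placeholder shifts only (rotations and scale set to zero) applied to one point.
print(helmert_7param([856000.0, -4843000.0, 4048000.0],
                     -8.0, 160.0, 176.0, 0.0, 0.0, 0.0, 0.0))
```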
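
The bridge service-life entry above develops linear regression models relating the recommended rehabilitation time point to the physical damage level. A minimal sketch of such a fit, using entirely hypothetical damage and rehabilitation-age values rather than the study's survey responses:

```python
import numpy as np

# Hypothetical data: percent of deck area damaged versus the age (years) at
# which engineers recommended rehabilitation.  Values are illustrative only.
damage_pct = np.array([4.0, 7.5, 9.0, 12.0, 15.5, 18.0])
rehab_age = np.array([28.0, 24.0, 22.0, 19.0, 16.0, 14.0])

# Ordinary least-squares straight-line fit, the form of the prediction models.
slope, intercept = np.polyfit(damage_pct, rehab_age, 1)
print(f"recommended rehab age = {intercept:.1f} + ({slope:.2f} * damage %)")
```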
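
The thresholding entry above rejects pixels whose best-fitting class has a low probability. A minimal sketch of a Gaussian maximum-likelihood classifier with a rejection threshold follows; the class statistics, pixel values, and the use of a simple density threshold (rather than the study's exact probability criterion) are assumptions made for illustration:

```python
import numpy as np

def ml_classify(pixels, means, covs, threshold=1e-3):
    """Gaussian maximum-likelihood classification with a density threshold.
    pixels: (n, bands) array; means/covs: per-class training statistics.
    Pixels whose best class density falls below `threshold` are labelled -1
    (unclassified) instead of being forced into a category."""
    n, b = pixels.shape
    densities = np.empty((n, len(means)))
    for k, (mu, cov) in enumerate(zip(means, covs)):
        inv = np.linalg.inv(cov)
        norm = 1.0 / np.sqrt((2.0 * np.pi) ** b * np.linalg.det(cov))
        diff = pixels - mu
        mahal = np.einsum("ij,jk,ik->i", diff, inv, diff)   # squared Mahalanobis
        densities[:, k] = norm * np.exp(-0.5 * mahal)
    best = densities.argmax(axis=1)
    return np.where(densities.max(axis=1) >= threshold, best, -1)

# Two hypothetical spectral classes in two bands; the second pixel is rejected.
means = [np.array([40.0, 60.0]), np.array([90.0, 30.0])]
covs = [np.eye(2) * 25.0, np.eye(2) * 36.0]
print(ml_classify(np.array([[42.0, 58.0], [150.0, 150.0]]), means, covs))
```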
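
The grid-lattice entry above compares filters that approximate the Laplacian on 8-neighbor (square) and 6-neighbor (hexagonal) lattices. The sketch below shows one common stencil for each lattice and checks both against a surface whose Laplacian is known; the normalizing constants are standard results for quadratic surfaces, not necessarily the criterion derived in the thesis:

```python
import numpy as np

def laplacian_square(f, h=1.0):
    """8-neighbour Laplacian estimate at the origin of a square lattice with
    spacing h: (sum of the 8 neighbours - 8 * centre) / (3 * h**2)."""
    pts = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
    return (sum(f(dx * h, dy * h) for dx, dy in pts) - 8.0 * f(0.0, 0.0)) / (3.0 * h * h)

def laplacian_hex(f, h=1.0):
    """6-neighbour Laplacian estimate at the origin of a hexagonal lattice with
    spacing h: (sum of the 6 neighbours - 6 * centre) * 2 / (3 * h**2)."""
    angles = np.deg2rad(np.arange(0, 360, 60))
    total = sum(f(h * np.cos(a), h * np.sin(a)) for a in angles)
    return (total - 6.0 * f(0.0, 0.0)) * 2.0 / (3.0 * h * h)

# Test surface with a known Laplacian: f = x^2 + 2*y^2, so its Laplacian is 6.
f = lambda x, y: x ** 2 + 2.0 * y ** 2
print(laplacian_square(f), laplacian_hex(f))   # both evaluate to 6.0
```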
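
The area-estimation entry above attributes part of the acreage error to terrain relief. As a short worked illustration using the standard relief displacement relation for a vertical photograph, d = r * h / H (the 60 mm image radius below is a placeholder, not a value from the thesis):

```python
def relief_displacement(r_mm, h_ft, flying_height_ft):
    """Radial image displacement (mm) of a point h feet above the datum,
    imaged r_mm from the photo nadir, on a vertical photo taken at
    flying_height_ft above the datum:  d = r * h / H."""
    return r_mm * h_ft / flying_height_ft

# Roughly the conditions simulated in the thesis: about 105 ft of relief
# photographed from 4000 ft above the datum.
print(relief_displacement(60.0, 105.0, 4000.0))   # about 1.6 mm of displacement
```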
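
The Sequential Givens entry above describes folding observations one at a time into a triangular system instead of forming normal equations. A minimal sketch of that sequential Givens update, applied to a small level-net-style adjustment with made-up observations (the blunder-detection statistics discussed in the thesis are not reproduced here):

```python
import numpy as np

def givens_update(R, d, row, obs):
    """Fold one observation equation (row, obs) into the upper-triangular
    factor R and transformed right-hand side d using Givens rotations."""
    row = np.array(row, dtype=float)
    for j in range(len(row)):
        if row[j] == 0.0:
            continue
        r = np.hypot(R[j, j], row[j])
        c, s = R[j, j] / r, row[j] / r
        R[j, j:], row[j:] = c * R[j, j:] + s * row[j:], -s * R[j, j:] + c * row[j:]
        d[j], obs = c * d[j] + s * obs, -s * d[j] + c * obs
    return obs   # rotated residual component contributed by this observation

# Tiny example: unknowns x = (x1, x2) observed directly and through sums/differences.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0], [1.0, 1.0]])
b = np.array([1.02, 2.01, -0.98, 3.03])
n = A.shape[1]
R, d = np.zeros((n, n)), np.zeros(n)
for a_i, b_i in zip(A, b):        # observations processed sequentially
    givens_update(R, d, a_i, b_i)
print(np.linalg.solve(R, d))      # back-substitution on the triangular system
```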