Record ID | marc_university_of_toronto/uoft.marc:5423066546:2077 |
Source | University of Toronto |
Download Link | /show-records/marc_university_of_toronto/uoft.marc:5423066546:2077?format=raw |
LEADER: 02077nam 2200241 4500
001 AAIMR16172
005 20070119094004.5
008 070119s2006 onc|||||||||||||| ||eng d
020 $a9780494161722
039 $feh
100 1 $aWong, Joseph.
245 12 $aA neural-network approach to high-precision docking of autonomous vehicles.
260 $c2006.
300 $a111 leaves.
500 $aSource: Masters Abstracts International, Volume: 44-06, page: 2984.
502 $aThesis (M.A.Sc.)--University of Toronto, 2006.
506 $aElectronic version licensed for access by U. of T. users.
520 $aThe objective of this thesis is to develop a neural-network-based guidance methodology for high-precision, short-range localization of autonomous vehicles (i.e., docking). The novelty of the overall system is its applicability to cases that do not allow for direct proximity measurement of the vehicle's pose. Herein, line-of-sight-based indirect proximity sensory feedback is used by the neural-network (NN) based guidance methodology for path planning during the final stage of the vehicle's motion (i.e., docking). The corrective motion commands generated by the NN model are used to iteratively reduce the systematic motion errors accumulated by the vehicle over its long-range motion, until the vehicle achieves its desired pose within random-noise limits. The overall vehicle-docking methodology provides effective guidance that is independent of the sensing system's calibration model. Comprehensive simulation and experimental studies have verified the proposed guidance methodology for high-precision vehicle docking.
590 $aROBARTS MICROTEXT copy on microfiche.$5CaOTU
653 $aEngineering, Mechanical.
856 41 $uhttp://link.library.utoronto.ca/eir/EIRdetail.cfm?Resources__ID=442092&T=F$yConnect to resource
949 $atheses masters$wALPHANUM$c1$i6063793-1001$j2$lMICROTEXT$mMEDIA_COMM$rN$sY$tMICROFORM$u2/2/2007
949 $aOnline resource 442092$wASIS$c1$i6063793-2001$lONLINE$mE_RESOURCE$rY$sY$tE_RESOURCE$u2/2/2007