Show simple item record

dc.contributor.author  Bendelac, Shiri  en_US
dc.date.accessioned  2018-06-23T08:00:16Z
dc.date.available  2018-06-23T08:00:16Z
dc.date.issued  2018-06-22  en_US
dc.identifier.other  vt_gsexam:15455  en_US
dc.identifier.uri  http://hdl.handle.net/10919/83714
dc.description.abstract  Neural networks make headlines every day as the tool of the future, powering artificial intelligence programs and enabling technologies never seen before. However, training a neural network can take days or even weeks for larger networks, and achieving state-of-the-art results in academia and industry requires supercomputers and GPUs. This thesis discusses employing selective measures to determine when to backpropagate and forward propagate in order to reduce training time while maintaining classification performance. The new algorithms are tested on the MNIST and CASIA datasets, and both achieve successful results. The selective backpropagation algorithm reduces the number of backpropagations performed by up to 93.3%, and the selective forward propagation algorithm reduces the number of forward propagations and backpropagations performed by up to 72.90%, compared to baseline runs that always forward propagate and backpropagate. This work also discusses applying the selective backpropagation algorithm to a modified dataset in which some classes are disproportionately under-represented relative to others.  en_US
dc.format.medium  ETD  en_US
dc.publisher  Virginia Tech  en_US
dc.rights  This item is protected by copyright and/or related rights. Some uses of this item may be deemed fair and permitted by law even without permission from the rights holder(s), or the rights holder(s) may have licensed the work for use under certain conditions. For other uses you need to obtain permission from the rights holder(s).  en_US
dc.subject  machine learning  en_US
dc.subject  neural networks  en_US
dc.subject  convolutional neural networks  en_US
dc.subject  backpropagation  en_US
dc.subject  forward propagation  en_US
dc.subject  training  en_US
dc.title  Enhanced Neural Network Training Using Selective Backpropagation and Forward Propagation  en_US
dc.type  Thesis  en_US
dc.contributor.department  Electrical and Computer Engineering  en_US
dc.description.degree  Master of Science  en_US
thesis.degree.name  Master of Science  en_US
thesis.degree.level  masters  en_US
thesis.degree.grantor  Virginia Polytechnic Institute and State University  en_US
thesis.degree.discipline  Computer Engineering  en_US
dc.contributor.committeechair  Ernst, Joseph M.  en_US
dc.contributor.committeechair  Huang, Jia-Bin  en_US
dc.contributor.committeemember  Wyatt, Chris L.  en_US
dc.contributor.committeemember  Headley, William C.  en_US
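The abstract describes skipping backpropagation (and, in the second algorithm, forward propagation) for selected samples to cut training time. The record does not state the thesis's exact selection criterion, so the sketch below is a hypothetical illustration only: a tiny logistic-regression trainer that skips the weight update whenever the model is already confident and correct on a sample. The confidence threshold of 0.9 and the toy data are assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data: 2-D points labeled by the sign of x0 + x1.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.5
threshold = 0.9  # assumed criterion: skip backprop when p(true class) > 0.9
skipped = 0
total = 0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(5):
    for xi, yi in zip(X, y):
        total += 1
        p = sigmoid(xi @ w + b)               # forward propagation
        p_true = p if yi == 1.0 else 1.0 - p  # confidence in the true class
        if p_true > threshold:                # confident and correct:
            skipped += 1                      # skip the backward pass
            continue
        grad = p - yi                         # dL/dz for logistic loss
        w -= lr * grad * xi                   # backpropagation (update)
        b -= lr * grad

skip_rate = skipped / total
acc = float(np.mean((sigmoid(X @ w + b) > 0.5) == (y == 1.0)))
```

As training progresses, more samples clear the confidence threshold, so the fraction of skipped backward passes grows — the same effect the abstract quantifies as up to 93.3% fewer backpropagations on MNIST and CASIA (the reductions reported there come from the thesis's own criterion, not this sketch).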

