Faster algorithms for matrix scaling and balancing via convex optimization

Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017. Cataloged from PDF version of thesis. Includes bibliographical references (pages 61-65).

In this thesis, we study matrix scaling and balancing, which are fundamental problems in scientific computing with a long line of work dating back to the 1960s. We provide algorithms for both of these problems that, ignoring logarithmic factors involving the dimension of the input matrix and the size of its entries, run in time Õ(m log K log²(1/ε)), where ε is the amount of error we are willing to tolerate. Here, K represents the ratio between the largest and the smallest entries of the optimal scalings. This implies that our algorithms run in nearly-linear time whenever K is quasi-polynomial, which includes, in particular, the case of strictly positive matrices. We complement our results by providing a separate algorithm that uses an interior-point method and runs in time Õ(m³/²(log log K + log(1/ε))), which becomes Õ(m³/² log(1/ε)) for the case of matrix balancing and the doubly-stochastic variant of matrix scaling. In order to establish these results, we develop a new second-order optimization framework that enables us to treat both problems in a unified and principled manner. This framework identifies a certain generalization of linear system solving that we can use to efficiently minimize a broad class of functions, which we call second-order robust. We then show that, in the context of the specific functions capturing matrix scaling and balancing, we can leverage and generalize the work on Laplacian system solving to make the algorithms obtained via this framework very efficient. This thesis is based on joint work with Michael B. Cohen, Aleksander Mądry, and Adrian Vladu.
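
To make the "convex optimization" in the title concrete: matrix scaling and balancing can be written as unconstrained convex minimization problems. The formulation below is a standard one from the literature, not text taken from the thesis. For a nonnegative matrix A with target row sums r and column sums c, scalings X = diag(e^x) and Y = diag(e^y) correspond to stationary points of f_scale, and balancings D = diag(e^x) to stationary points of f_bal:

\[
f_{\mathrm{scale}}(x,y) = \sum_{i,j} A_{ij}\, e^{x_i + y_j} - \sum_i r_i x_i - \sum_j c_j y_j,
\qquad
f_{\mathrm{bal}}(x) = \sum_{i \neq j} A_{ij}\, e^{x_i - x_j}.
\]

Setting the gradient of f_scale to zero says exactly that diag(e^x) A diag(e^y) has row sums r and column sums c, and setting the gradient of f_bal to zero says that diag(e^x) A diag(e^{-x}) has equal off-diagonal row and column sums. The Hessians of both functions are closely related to graph Laplacians of the scaled matrix, which is the connection to Laplacian system solving mentioned in the abstract.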

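For further context, below is a minimal sketch of the classical first-order iterations for these problems: Sinkhorn-Knopp for doubly-stochastic scaling and an Osborne-style coordinate update for balancing (here in the 1-norm, matching the formulation above). This is not the thesis's algorithm; the function names, the tolerance eps, the iteration caps, and the greedy coordinate choice are illustrative, and a strictly positive input matrix is assumed to keep the sketch simple.

```python
import numpy as np

def sinkhorn_scaling(A, eps=1e-6, max_iter=10_000):
    """Classical Sinkhorn-Knopp iteration: find positive vectors x, y so that
    diag(x) @ A @ diag(y) is approximately doubly stochastic, by alternately
    normalizing rows and columns.  Assumes A is strictly positive."""
    x = np.ones(A.shape[0])
    y = np.ones(A.shape[1])
    for _ in range(max_iter):
        x = 1.0 / (A @ y)            # make every row sum of diag(x) A diag(y) equal 1
        y = 1.0 / (A.T @ x)          # make every column sum equal 1
        B = (x[:, None] * A) * y[None, :]
        # After the y-update the column sums are exactly 1; stop once the row
        # sums are also within eps of 1.
        if np.abs(B.sum(axis=1) - 1.0).max() < eps:
            break
    return np.diag(x), np.diag(y)

def osborne_balancing(A, eps=1e-6, max_iter=10_000):
    """Osborne-style balancing in the 1-norm: find a positive diagonal D so
    that D A D^{-1} has approximately equal off-diagonal row and column sums,
    by repeatedly fixing the most unbalanced coordinate.  Assumes A is
    strictly positive."""
    d = np.ones(A.shape[0])
    for _ in range(max_iter):
        B = (d[:, None] * A) / d[None, :]
        np.fill_diagonal(B, 0.0)         # the diagonal is unaffected by balancing
        r = B.sum(axis=1)                # off-diagonal row sums
        c = B.sum(axis=0)                # off-diagonal column sums
        gap = np.abs(r - c)
        if gap.max() < eps:
            break
        i = int(np.argmax(gap))          # most unbalanced coordinate
        d[i] *= np.sqrt(c[i] / r[i])     # equalizes row i and column i sums
    return np.diag(d)
```

Both iterations are simple, but their convergence can degrade when the optimal scalings have a large ratio K between their extreme entries; the thesis's second-order framework is designed so that the running time depends only logarithmically on K.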

Bibliographic Details
Main Author: Tsipras, Dimitrios
Other Authors: Aleksander Mądry.
Format: Others
Language: English
Published: Massachusetts Institute of Technology 2017
Subjects: Electrical Engineering and Computer Science
Online Access: http://hdl.handle.net/1721.1/112050
id ndltd-MIT-oai-dspace.mit.edu-1721.1-112050
record_format oai_dc
spelling ndltd-MIT-oai-dspace.mit.edu-1721.1-112050 2019-05-02T16:22:03Z Faster algorithms for matrix scaling and balancing via convex optimization Tsipras, Dimitrios Aleksander Mądry. Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science. Electrical Engineering and Computer Science. Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017. Cataloged from PDF version of thesis. Includes bibliographical references (pages 61-65). by Dimitrios Tsipras. S.M. 2017-10-30T15:29:18Z 2017 Thesis http://hdl.handle.net/1721.1/112050 1006508897 eng MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source, but further reproduction or distribution in any format is prohibited without written permission. http://dspace.mit.edu/handle/1721.1/7582 77 pages application/pdf Massachusetts Institute of Technology
collection NDLTD
language English
format Others
sources NDLTD
topic Electrical Engineering and Computer Science.
spellingShingle Electrical Engineering and Computer Science.
Tsipras, Dimitrios
Faster algorithms for matrix scaling and balancing via convex optimization
description Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017. === Cataloged from PDF version of thesis. === Includes bibliographical references (pages 61-65). === In this thesis, we study matrix scaling and balancing, which are fundamental problems in scientific computing with a long line of work dating back to the 1960s. We provide algorithms for both of these problems that, ignoring logarithmic factors involving the dimension of the input matrix and the size of its entries, run in time Õ(m log K log²(1/ε)), where ε is the amount of error we are willing to tolerate. Here, K represents the ratio between the largest and the smallest entries of the optimal scalings. This implies that our algorithms run in nearly-linear time whenever K is quasi-polynomial, which includes, in particular, the case of strictly positive matrices. We complement our results by providing a separate algorithm that uses an interior-point method and runs in time Õ(m³/²(log log K + log(1/ε))), which becomes Õ(m³/² log(1/ε)) for the case of matrix balancing and the doubly-stochastic variant of matrix scaling. In order to establish these results, we develop a new second-order optimization framework that enables us to treat both problems in a unified and principled manner. This framework identifies a certain generalization of linear system solving that we can use to efficiently minimize a broad class of functions, which we call second-order robust. We then show that, in the context of the specific functions capturing matrix scaling and balancing, we can leverage and generalize the work on Laplacian system solving to make the algorithms obtained via this framework very efficient. This thesis is based on joint work with Michael B. Cohen, Aleksander Mądry, and Adrian Vladu. === by Dimitrios Tsipras. === S.M.
author2 Aleksander Mądry.
author_facet Aleksander Mądry.
Tsipras, Dimitrios
author Tsipras, Dimitrios
author_sort Tsipras, Dimitrios
title Faster algorithms for matrix scaling and balancing via convex optimization
title_short Faster algorithms for matrix scaling and balancing via convex optimization
title_full Faster algorithms for matrix scaling and balancing via convex optimization
title_fullStr Faster algorithms for matrix scaling and balancing via convex optimization
title_full_unstemmed Faster algorithms for matrix scaling and balancing via convex optimization
title_sort faster algorithms for matrix scaling and balancing via convex optimization
publisher Massachusetts Institute of Technology
publishDate 2017
url http://hdl.handle.net/1721.1/112050
work_keys_str_mv AT tsiprasdimitrios fasteralgorithmsformatrixscalingandbalancingviaconvexoptimization
_version_ 1719039204564926464