
## Techniques for Probabilistic Analysis and Randomness-Efficient Computation

##### **Author**

Srinivasan, Aravind

##### **Abstract**

Randomness is well recognized as an important computational resource in theoretical computer science. Not only are there classical problems for which the only known "efficient" solutions are randomized, but there are also problems for which randomness can be used to circumvent deterministic lower bounds. However, two issues arise: first, standard tools for probabilistic analysis, such as the Chernoff-Hoeffding bounds, are not sufficiently powerful in all situations; second, since there are no known "true" sources of randomness, there is a need for general techniques for reducing or removing randomness from randomized algorithms. This thesis addresses these two issues.

Certain types of scheduling and resource-allocation problems in a distributed setting can be modeled as edge-coloring problems. In Chapter 2, we present a fast and simple randomized algorithm for edge coloring a graph in the synchronous distributed point-to-point model of computation. Our algorithm computes an edge coloring of a graph $G$ with $n$ nodes and maximum degree $\Delta$ using at most $(1.6 + \epsilon)\Delta + O(\log^{2+\delta} n)$ colors with high probability, for any fixed $\epsilon, \delta > 0$. To analyze the algorithm, we introduce techniques for proving upper bounds on the tails of random variables, extending the Chernoff-Hoeffding (CH) bounds to certain types of dependent random variables. This is joint work with A. Panconesi [91].

In Chapter 3, we show that the CH bounds on the tails of sums of bounded and independent random variables $X_{1}, \ldots, X_{n}$ require only limited independence among the $X_{i}$: if $X_{1}, \ldots, X_{n}$ lie in $[0,1]$ and are $k$-wise independent, then $\Pr(\sum_{i} X_{i} \geq \mu(1 + \delta))$, where $\mu = E[\sum_{i} X_{i}]$, can be upper bounded by the CH bound whenever $k \geq \mu \cdot \min\{\delta, \delta^{2}\}$. This yields algorithms that require a reduced amount of randomness in any analysis that uses the CH bounds. This is joint work with J.P. Schmidt and A. Siegel [108].

In Chapter 4, we present an $RNC^{2}$ algorithm for the perfect matching problem that uses $O(\log Z)$ random bits, where $Z$ is any given upper bound on the number of perfect matchings in the graph, generalizing results of Grigoriev and Karpinski. Underlying our algorithm is a randomness-optimal generalization of the Isolating Lemma of Mulmuley, Vazirani, and Vazirani, which also leads to other applications. This is joint work with S. Chari and P. Rohatgi [26].
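As an illustrative sketch (not part of the original record), the quantities in the Chapter 3 statement can be computed directly: the standard CH upper-tail bound $(e^{\delta}/(1+\delta)^{1+\delta})^{\mu}$, and the amount of independence $k \geq \mu \cdot \min\{\delta, \delta^{2}\}$ that the thesis states suffices for that bound to apply. The function names and the example values of $\mu$ and $\delta$ below are the author's own choices for illustration.

```python
import math

def chernoff_upper_tail(mu: float, delta: float) -> float:
    """Standard Chernoff-Hoeffding upper-tail bound:
    Pr(sum_i X_i >= mu*(1 + delta)) <= (e^delta / (1+delta)^(1+delta))^mu,
    for X_i in [0,1] with mu = E[sum_i X_i]."""
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def independence_sufficient(mu: float, delta: float) -> int:
    """Smallest integer k meeting the stated condition
    k >= mu * min(delta, delta^2), under which k-wise independence
    already suffices for the CH bound above."""
    return math.ceil(mu * min(delta, delta ** 2))

# Example: a sum with expectation 50 and a 50% relative deviation.
mu, delta = 50.0, 0.5
print(f"CH tail bound:            {chernoff_upper_tail(mu, delta):.3e}")
print(f"k-wise independence with: k >= {independence_sufficient(mu, delta)}")
```

The point of the Chapter 3 result is visible here: full ($n$-wise) independence is not needed; for this example, 13-wise independence already gives the same exponentially small tail bound.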

##### **Date Issued**

1993-08

##### **Publisher**

Cornell University

##### **Subject**

computer science; technical report

##### **Previously Published As**

http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR93-1378

##### **Type**

technical report