"Antenna Man" rakes WGBH over the coals
Garrett Wollman
wollman@bimajority.org
Wed Jul 15 23:02:27 EDT 2020
<<On Wed, 15 Jul 2020 20:26:27 -0400, Scott Fybush <scott@fybush.com> said:
> The repack process didn't care much about original analog channels, and had
> no reason to.
To maybe make this a bit clearer -- and put it in terms that will make
sense to John and others who work in software -- we can talk a bit
about how the spectrum auction worked.
In each phase of the "incentive auction", the FCC started by
determining an amount of spectrum they wanted to clear. They then ran
a "forward auction" for the wireless carriers to bid on spectrum,
which gave them a revenue figure.
Then, multiple rounds of "reverse auction" were run, in which TV
licensees were given individualized offers to go dark, move to
VHF-high, or move to VHF-low as appropriate.
The stations that refused the FCC's offer in any round were kicked out
of the auction for the rest of the phase, and the offers to the
remaining stations were *reduced* in value with each round until only
the minimum number of stations had accepted an offer. Stations that
had accepted an offer in any round were permanently bound by their
last bid. The auction was thus structured to keep stations
participating from round to round even as the offers shrank.
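
To sketch roughly how one phase's reverse auction behaved, here is a
toy version in Python; the will_accept callback, the opening offers,
and the 10% per-round cut are all invented for illustration, not the
FCC's actual pricing rules:

    # Toy descending-clock reverse auction for one phase.
    # Offer figures, the acceptance callback, and the 10% cut are invented.
    def run_reverse_auction(opening_offers, will_accept, target_count, cut=0.90):
        offers = dict(opening_offers)   # station -> current offer
        last_bid = {}                   # station -> last accepted offer (binding)
        while True:
            accepted = [s for s in offers if will_accept(s, offers[s])]
            refused = [s for s in offers if s not in accepted]
            for s in accepted:
                last_bid[s] = offers[s]  # permanently bound by their last bid
            for s in refused:
                del offers[s]            # out of the auction for the rest of the phase
            if len(accepted) <= target_count:
                # only the minimum number of stations still accepting: done
                return {s: last_bid[s] for s in accepted}
            for s in offers:
                offers[s] *= cut         # *reduce* every remaining offer and go again
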
For each round of the reverse auction, a satisfiability solver was run
to ensure that there was *some* reassignment of spectrum that would
give every remaining station a reasonable approximation of coverage of
their market and would respect the agreements the FCC made with the
Mexican and Canadian authorities, and that there was a feasible
transition that required no more than two repack phases in each
broadcast TV market. Satisfiability, or "SAT", is an NP-complete
problem, which means that while it's easy to test whether any given
solution actually works, there is no known efficient way to compute such a
solution in the first place. SAT algorithms are worst-case
exponential in the number of terms, and the problem here --
reassigning spectrum throughout the entire US, the populated parts of
Canada, and the border states of Mexico -- has millions of terms.
Most SAT solvers use randomized algorithms to avoid the worst-case
behavior (given some assumptions), so each subsequent run will result
in a *different* assignment.
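
To make the shape of that feasibility problem concrete, here is a toy
version in Python. The stations, allowed channels, and interference
pairs below are invented, and the real auction fed a vastly larger
model (plus the border agreements and the phase constraints) to
industrial SAT solvers -- but the "easy to check, hard to find"
asymmetry is the same:

    from itertools import product

    # Toy repack instance -- station names, channels, and conflicts all invented.
    ALLOWED = {                     # station -> channels that roughly preserve coverage
        "WAAA": [7, 8, 9],
        "WBBB": [8, 9],
        "WCCC": [7, 9],
    }
    CONFLICTS = [("WAAA", "WBBB"),  # pairs whose signals overlap, so they
                 ("WBBB", "WCCC")]  # may not be assigned the same channel

    def is_feasible(assignment):
        """Checking a proposed assignment is cheap: one pass over the constraints."""
        return (all(assignment[s] in ALLOWED[s] for s in ALLOWED) and
                all(assignment[a] != assignment[b] for a, b in CONFLICTS))

    def find_assignment():
        """Finding an assignment is the hard part: naive search is exponential
        in the number of stations, which is why a real SAT solver was needed."""
        stations = list(ALLOWED)
        for channels in product(*(ALLOWED[s] for s in stations)):
            candidate = dict(zip(stations, channels))
            if is_feasible(candidate):
                return candidate
        return None                 # no repack exists for this instance

    print(find_assignment())        # prints the first feasible channel map it finds
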
This process was repeated in each subsequent phase, with the target
amount of cleared spectrum *reduced* -- thereby increasing the
wireless carriers' bids and decreasing the number of broadcast
licensees the government was looking to buy out -- until the total of
all the winning bids in the forward auction, minus the total of all
the winning bids in the reverse auction, minus the estimated
construction costs to be paid to broadcast licensees for changing
channels, was greater than the Congressionally mandated "profit" the
FCC was required to collect for the Treasury.
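
The stopping rule itself is just arithmetic over the auction totals;
as a rough sketch in Python, with invented dollar amounts and none of
the real rules' additional reserve details:

    # Sketch of the per-phase closing test; all dollar amounts are invented.
    def phase_closes(forward_revenue, reverse_payouts, construction_costs,
                     mandated_profit):
        # Forward-auction receipts must cover the broadcaster buyouts, the
        # channel-change construction costs, and the amount Congress
        # required the FCC to turn over to the Treasury.
        return (forward_revenue - reverse_payouts - construction_costs
                > mandated_profit)

    # Invented example: once this holds, no further phase is needed.
    print(phase_closes(20e9, 10e9, 2e9, 7e9))   # 20 - 10 - 2 = 8 > 7 -> True
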
Once the auction results were locked in, another solver run was
initiated to compute the final set of channel assignments and
transition phases for each market, along with those big lists of
stations that all had to change at exactly the same time.
The whole thing -- the "incentive auction" structure and all of the
various pieces -- comes out of a crossover field of economics and
computer science called "mechanism design". (I learned about it by
attending a talk at work by one of the theoreticians who had helped to
figure out how to do the computational part efficiently.)
-GAWollman