Interview with Sasha Javid, Chief Data Officer, FCC - Speaker at 5th Annual Global Big Data Conf
Posted on: Jul 25, 2017

We feature speakers at the 5th Annual Global Big Data Conference - August 2017 to catch up and find out what they are working on now and what's coming next. This week we're talking to Sasha Javid, Chief Data Officer, FCC (Topic: Using OR And Big Data In The FCC's Incentive Auction).

Interview with Sasha Javid

1. Tell us about yourself and your background.
I am the Chief Data Officer and Legal Advisor for the Incentive Auction Task Force at the FCC. I have worked in the wireless space for almost 20 years in various capacities, including as a government regulator, entrepreneur and venture capitalist.

2. What have you been working on recently?
The Incentive Auction Task Force was assigned the responsibility of implementing the world's first two-way spectrum auction. The goal of this so-called “incentive auction” was to repurpose a portion of the broadcast television spectrum (i.e., the 600 MHz band) for mobile broadband to support the exploding growth of mobile traffic. As Chief Data Officer, my responsibility was to ensure that the Commission could make data-driven decisions in support of the auction and the post-auction transition. This auction recently concluded, generating approximately $20 billion in proceeds from mobile broadband carriers, of which $10 billion was paid to broadcast television stations in exchange for relinquishing their spectrum.

3. Tell us about the tools you used to solve the challenges you faced in implementing this auction.
Given the novelty of this auction, the FCC faced many unique challenges that required various data tools. To assist world-renowned auction designers, the FCC developed an auction simulation platform which leveraged customized SAT solvers to determine whether a given set of stations could be assigned to a given set of channels while satisfying millions of constraints. To determine what amount of spectrum to clear given a set of participating stations, and to determine both the final channel and transition phase assignments given a set of remaining stations, the FCC used optimization software. Finally, to turn millions of points of interference into constraints, the FCC used a variety of databases, ETL software, analytics packages and visualization tools.
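To make the SAT-based feasibility check concrete, here is a minimal sketch of the kind of encoding involved, written against the open-source python-sat package with hypothetical stations, channels and interference pairs; the FCC's actual customized solvers and constraint sets were vastly larger.

```python
# Illustrative sketch only: a toy repacking feasibility check using the
# open-source python-sat package. All stations, channels and conflicts
# below are hypothetical, not the FCC's actual data or solvers.
from itertools import combinations
from pysat.solvers import Glucose3

stations = ["WAAA", "WBBB", "WCCC"]   # hypothetical stations
channels = [14, 15]                   # hypothetical available channels
# Hypothetical co-channel conflicts: these pairs may not share a channel.
interference = [("WAAA", "WBBB"), ("WBBB", "WCCC")]

# Map each (station, channel) pair to a positive SAT variable.
var = {(s, c): i + 1 for i, (s, c) in enumerate(
    (s, c) for s in stations for c in channels)}

solver = Glucose3()
for s in stations:
    # Each station gets at least one channel...
    solver.add_clause([var[(s, c)] for c in channels])
    # ...and at most one channel.
    for c1, c2 in combinations(channels, 2):
        solver.add_clause([-var[(s, c1)], -var[(s, c2)]])

# Interfering stations may not be assigned the same channel.
for s1, s2 in interference:
    for c in channels:
        solver.add_clause([-var[(s1, c)], -var[(s2, c)]])

if solver.solve():
    model = set(solver.get_model())
    print({s: c for (s, c), v in var.items() if v in model})
else:
    print("No feasible channel assignment exists")
```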

4. Where are we now today in terms of the Big data, and where do you think we’ll go over the next five years?
The effective use of Big Data at the FCC, as at most government agencies, has become critical to its success. Projects like the incentive auction could not have been done without the ability to process millions of rows of TV-to-TV and TV-to-Mobile Broadband interference data and turn them into constraints that could be used by the SAT solvers and optimization software. Moreover, without effective GIS mapping tools and data visualization tools to help make sense of this data, it is likely the FCC would not have made the right policy decisions in support of this auction. Moving forward, I expect the reliance on Big Data tools will grow as more of the data that the FCC collects is moved off legacy systems and into the cloud.
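As a rough illustration of that constraint-generation step, here is a short sketch that turns a hypothetical CSV of pairwise interference records into constraint tuples with pandas; the real pipeline, schemas and data volumes were of course different.

```python
# A minimal ETL sketch, assuming a hypothetical CSV schema of pairwise
# interference records; the FCC's actual schemas and pipeline differ.
import pandas as pd

# Hypothetical schema: one row per (station_a, station_b, channel, kind)
# combination that would cause harmful interference.
df = pd.read_csv("tv_interference.csv")

# Keep TV-to-TV co-channel conflicts and drop duplicate rows.
pairs = (df[df["kind"] == "CO"]
         [["station_a", "station_b", "channel"]]
         .drop_duplicates())

# Emit one "not both on this channel" constraint per row, in the shape
# a feasibility checker like the one sketched earlier could consume.
constraints = {(r.station_a, r.station_b, r.channel)
               for r in pairs.itertuples(index=False)}
print(f"{len(constraints)} co-channel constraints generated")
```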

5. You’ve already hired approximately Y people. What would be your pitch to folks out there to join your organization? Why does your organization matter in the world?
What I love about working at the FCC is that your work matters. The incentive auction touched on so many segments of our information economy, from mobile broadband providers to broadcast television stations to unlicensed wireless devices. The auction proved that market forces could be used to free up spectrum for higher-value uses such as mobile broadband, which is critical to the continued technology leadership of the United States.

6. What are some of the key takeaways attendees can expect from your workshop on "Using OR And Big Data in The FCC's Incentive Auction"?
The combination of OR and Big Data in government can facilitate all types of interesting projects previously thought impossible. Leveraging cloud computing and a combination of custom and off-the-shelf SAT solvers, OR software and Big Data tools, the FCC was able to conduct a multi-stage and multi-round auction that required (a) the real-time repacking of the nation’s broadcasters hundreds of times throughout the auction, (b) the determination of a new band plan, including license impairment calculations, at each stage of the auction and ultimately (c) a transition and reassignment schedule for remaining television stations.
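As a rough sketch of dynamic (a), the loop below caricatures a descending-clock reverse auction in which every price decrement is gated by a repacking feasibility check; every name, number and the toy checker are hypothetical, and the real auction rules were far more elaborate.

```python
# A highly simplified sketch of a descending-clock reverse auction.
# All names, prices and the toy feasibility checker are hypothetical.

def reverse_auction(stations, opening_prices, reserve, can_repack,
                    decrement=0.05):
    """Lower each active station's price until it either exits (and must
    be repacked) or freezes as a provisional winner. Assumes positive
    reserve prices so the clock always terminates."""
    price = dict(opening_prices)
    active = set(stations)   # still bidding to relinquish spectrum
    exited = set()           # will keep broadcasting; must get a channel
    winners = set()          # frozen at current price; will be paid

    while active:
        for s in sorted(active):          # snapshot; safe to mutate sets
            if not can_repack(exited | {s}):
                # s could no longer be repacked if it dropped out, so its
                # price freezes: the "real-time repacking" check.
                active.discard(s)
                winners.add(s)
            elif price[s] * (1 - decrement) < reserve[s]:
                # s rejects the lower price and exits to keep broadcasting.
                active.discard(s)
                exited.add(s)
            else:
                price[s] *= 1 - decrement  # s accepts the lower clock price
    return {s: price[s] for s in winners}, exited

# Toy run: pretend any two exited stations can be repacked, but not three.
payouts, repacked = reverse_auction(
    ["WAAA", "WBBB", "WCCC"],
    opening_prices={"WAAA": 100.0, "WBBB": 100.0, "WCCC": 100.0},
    reserve={"WAAA": 60.0, "WBBB": 80.0, "WCCC": 50.0},
    can_repack=lambda exited: len(exited) <= 2)
print("Provisional winners and prices:", payouts)
print("Stations to repack:", repacked)
```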

7. What are the top 5 Big data Use cases in government?
Resource allocation through optimization, business intelligence in support of rulemakings, managing public complaints/feedback, predictive analytics and cybersecurity, just to name a few.

8. Which company do you think is winning the global Big Data race?
I will avoid this one given my role in government, but clearly there are several great Big Data companies whose products we use all the time at the FCC.

9. Any closing thoughts?
Most importantly, that government can still do big things. The complexity of this auction was immense and has often been compared to solving a Rubik’s Cube. The Task Force pulled together leading auction designers, engineers and OR experts from both industry and within the agency to ensure that everything worked together perfectly for the auction and the subsequent post-auction transition planning.