Flash Digest – News in Brief


Decentralized but centrally watched? Government agencies race to regulate cryptocurrency exchanges.

On March 15, the SEC gave new signs of stepped-up policing of cryptocurrency. Speaking at an investment conference, Stephanie Avakian, Co-Director of the SEC’s Enforcement Division, said that the agency is “very active, and [she] would just expect to see more and more.” The increased scrutiny targets not only issuers of Initial Coin Offerings (ICOs) but also the myriad online trading platforms for cryptocurrencies that have emerged recently. Just a week before Avakian’s comment, the SEC published a statement warning operators of such platforms that they may need to register with the agency in one of several forms, notably as a national securities exchange or an alternative trading system (ATS).

Whether a financial instrument or store of value falls under the SEC’s jurisdiction depends on whether it is a “security.” The definition of “security” in Section 2(a)(36) of the Investment Company Act includes both specific instruments, such as stocks and bonds, and broader categories, such as any “investment contract” or “participation in any profit-sharing agreement.” The SEC has repeatedly asserted that, depending on their actual characteristics, many cryptocurrencies are securities and thus subject to its purview, regardless of labels like “coin” or “utility token.” Exchanges that facilitate trading of such crypto-securities must comply with SEC regulations in the same manner as conventional exchanges or ATSs.

Not all cryptocurrencies, however, are securities. The Commodity Futures Trading Commission (CFTC), the government watchdog that oversees derivatives markets, considers bitcoin and many other virtual currencies commodities under its regulatory power. A trading platform for these cryptocurrencies that allows leveraged or margin trades with net settlement must therefore comply with CFTC rules. In 2016, the agency sanctioned one such exchange, Bitfinex, for multiple violations, including failure to register. An exchange may escape CFTC scrutiny if it makes “actual delivery” of the cryptocurrency to customer-controlled accounts within 28 days of settlement; however, “actual delivery” has no official definition in the context of cryptocurrencies. Perhaps influenced by industry requests, the CFTC proposed a rule in December 2017, still pending public comment, to supply exactly that definition.

Another active participant in regulating cryptocurrency exchanges is the Treasury Department’s Financial Crimes Enforcement Network, or FinCEN. In 2013, the agency issued guidance on virtual currency providing that any exchange that “(1) accepts and transmits a convertible virtual currency or (2) buys or sells convertible virtual currency for any reason” is a money transmitter subject to FinCEN rules. In a letter to the Senate dated February 2018, the agency clarified and affirmed this position. As all these regulators begin taking enforcement actions, it remains to be seen whether the differing legal definitions of cryptocurrencies can coexist without conflict.


Tech’s biggest merger attempt may give rise to a new takeover defense: national security.

Broadcom’s $117 billion hostile bid for Qualcomm—the largest proposed deal in tech history—ended on March 12, 2018, when President Trump signed an executive order blocking it. The President claimed authority under § 721 of the Defense Production Act to prohibit the takeover on national security grounds. The order came after the Committee on Foreign Investment in the United States (CFIUS)—a panel chaired by the Treasury Department that reviews proposed deals for national security risks—concluded that the merger could “pose a serious risk” to national security.

At the center of CFIUS’ national security assessment is Qualcomm’s recent success in developing the next generation of wireless communications, 5G. The technology is widely seen as a potential game-changer because of its vast improvements in speed and stability over current standards. Recognizing the importance of 5G development, FCC Chairman Ajit Pai laid out plans in a February address to begin auctioning spectrum bands later this year, subject to Congressional approval. Another FCC plan would allow carriers to start building 5G infrastructure without undergoing environmental assessments. These actions, like the CFIUS recommendation, reflect fears that the US will lose to China and other powers in the race for a breakthrough technology.

Though this is not the first time a US President has blocked a merger under this executive authority, the prohibition was still highly unusual from a legal standpoint. First, CFIUS had taken only an interim action postponing the deal for 30 days and was still studying its details when the President signed the order ending all negotiations. The New York Times quoted investment lawyer John P. Kabealo calling the action “protectionist” and “activist.” Second, the CFIUS assessment and subsequent ban came before the deal was presented to Qualcomm’s shareholders for a vote. CFIUS studied the transaction only after Qualcomm filed a unilateral notice in January 2018 inviting its review, and it made its decision despite Broadcom’s public pledge to maintain R&D spending on 5G and increase training for American engineers.

Though Qualcomm emerged victorious, its management could face hurdles in winning back the approval of shareholders, who seemed to lean toward a Broadcom merger in the proxy vote. Still, US companies may have found a new strategy for fending off foreign acquisition attempts under the Trump administration. If national security can justify imposing tariffs on steel and aluminum and blocking mergers against the apparent will of the shareholders, what could be next?


Uber’s self-driving car killed a pedestrian, heating up debates on the legality of autonomous vehicles.

On March 18, a self-driving car being tested by Uber in Tempe, Arizona, struck and killed 49-year-old Elaine Herzberg, who was walking a bicycle across the traffic lanes from the left of the moving vehicle. Tempe police released video footage from the interior- and exterior-facing cameras mounted on the Uber vehicle. A safety driver sitting behind the wheel was monitoring the car at the time, but neither she nor the software attempted to brake before the collision. While investigators have not yet determined whether any party was responsible, Sam Abuelsamid, an analyst who covers self-driving cars, told the AP that it appears “the car is at fault” because the sensor system should have recognized Herzberg’s movements and reacted appropriately.

This marks the second death in the US involving a computer-operated car. In May 2016, a 2015 Tesla Model S crashed into a truck on a Florida state highway, killing its owner. Federal investigators from the National Highway Traffic Safety Administration (NHTSA) later cleared Tesla’s driver-assistance system, Autopilot, of fault, concluding that there was no system defect and that the driver had not been paying attention to the road. The two deaths, however, will do little to ease the public’s reluctance to accept self-driving cars. In 2017, a Pew Research Center report found that a majority of Americans still oppose sharing the road with computer-operated vehicles. Misgivings are particularly strong about autonomous cars with no human in the driver’s seat: 87 percent of respondents favored requiring an in-car operator.

To date, more than thirty states have introduced legislation on autonomous vehicles; twenty-one states and the District of Columbia have passed laws, and governors in ten states have issued executive orders aimed at the new technology. In February, California approved self-driving cars with no human inside, though it still requires a remote operator able to take control. These legislative efforts will no doubt leave gaps in assigning liability when a self-driving car causes an accident. Arizona regulators suggested last year that they would “take a back seat to the experts” before making rules, expressing hope that conventional mechanisms—human safety drivers and insurance companies—would suffice for now in apportioning civil liability. Criminal liability, with its usual requirement of a culpable state of mind, or mens rea, presents an even tougher challenge. Should the human driver be held responsible when she pays insufficient attention to the road, as in the Tesla accident? And with artificial intelligence (AI) operating the car in ways not comprehensible even to the system’s creators, who, if anyone, is at fault?


Tue Tran is a 1L student at Harvard Law School.