Did 2020 Break the Internet? - Lawfare

Posted: 09 Jun 2021 07:30 AM PDT

The coronavirus pandemic led to the abrupt transition of all aspects of life into cyberspace. Beginning in early March 2020, students adapted to remote learning, white-collar professionals adjusted to indefinite work-from-home policies, doctors conducted telehealth appointments, online workouts soared in popularity, and television binge-watching skyrocketed. Internet traffic surged to unprecedented levels almost overnight and remained elevated for the rest of the year. The coronavirus crisis indisputably served as the internet's biggest stress test to date. Yet network operators and consumers dispute how well the internet performed. At the end of the day, this disagreement boils down to the wrong framing of what defines "the internet."

The Initial Surge of Internet Traffic

"Internet traffic" refers to the amount of data passing through networks, and the volume of traffic is generally measured in bits. Regarding performance, speed matters. There are two ways to measure internet speed—latency and bandwidth. Latency measures how long it takes a bit to travel from one place to another and is therefore measured in units of time, such as milliseconds (1/1,000 of a second). Bandwidth measures how many bits can get from one place to another in a given period and is therefore measured in bits per second. Packet loss, another important metric for internet reliability, measures the share of packets that fail to reach their intended destination. Most discussions of internet performance focus on bandwidth, but latency and packet loss are also important indicators of user experience.
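The three metrics above can be computed from simple measurements. The sketch below is purely illustrative (the function names and sample numbers are my own, not from the article) and shows how each metric is derived and in what units it is reported.

```python
# Illustrative only: computing the three basic performance metrics
# described above from hypothetical measurements.

def latency_ms(send_time_s: float, receive_time_s: float) -> float:
    """Latency: travel time for a packet, reported in milliseconds."""
    return (receive_time_s - send_time_s) * 1000

def bandwidth_mbps(bits_transferred: int, elapsed_s: float) -> float:
    """Bandwidth: bits moved per second, reported in megabits per second."""
    return bits_transferred / elapsed_s / 1_000_000

def packet_loss_pct(packets_sent: int, packets_received: int) -> float:
    """Packet loss: percentage of packets that never arrived."""
    return (packets_sent - packets_received) / packets_sent * 100

# Hypothetical sample: a packet sent at t=1.0 s arrives at t=1.5 s;
# 800 million bits move in 10 seconds; 990 of 1,000 packets arrive.
print(latency_ms(1.0, 1.5))             # 500.0 ms
print(bandwidth_mbps(800_000_000, 10))  # 80.0 Mbps
print(packet_loss_pct(1000, 990))       # 1.0 percent
```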

Internet traffic as a whole, however, is difficult to measure. The internet consists of many interconnected networks as well as a three-level hierarchy of service providers. Tier 1 networks, such as the internet giants AT&T, Level 3, and Sprint, run the internet's "backbone," which is composed primarily of ultrafast fiber optic cables. Tier 1 providers then interface with Tier 2 internet service providers (ISPs), such as Comcast or Cox Communications, that handle regional traffic. Tier 3, or "last mile," providers then deliver the internet to a customer's home or office. Tier 2 and 3 networks are generally privately owned. Therefore, aggregated data from private companies, such as ISPs, content delivery networks (CDNs), and edge providers, offer the clearest insight into general internet traffic even if one company does not capture the full picture.

Akamai Technologies, one of the largest CDNs, reported data showing a significant rise in traffic during March 2020. CDNs distribute content, such as popular videos, files, or HTML code, by keeping a copy (called a cache) closer to the end user. Caches enable CDNs to improve website performance and load times. On April 6, 2020, Akamai's CEO Tom Leighton wrote, "From our vantage point, we can see that global internet traffic increased by about 30% during the past month." He explained, "That's about 10x normal, and it means we've seen an entire year's worth of growth in Internet traffic in just the past few weeks." While informative, this data is specific to Akamai and therefore might not represent traffic patterns as a whole. However, this measurement can be used as a floor, and internet traffic likely increased by at least this much and probably by much more.
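The caching idea behind a CDN can be reduced to a toy sketch. The class and callback below are my own illustration, not Akamai's architecture: an edge server returns a cached copy when it has one and contacts the origin server (then caches the result) only when it does not, which is why caches improve load times.

```python
# Toy illustration of CDN caching: the origin is contacted only on a
# cache miss; repeat requests are served from the edge.

class EdgeCache:
    def __init__(self, fetch_from_origin):
        self._origin = fetch_from_origin  # callable: url -> content
        self._cache = {}                  # the cache: url -> stored copy

    def get(self, url: str) -> str:
        if url in self._cache:            # cache hit: served near the user
            return self._cache[url]
        content = self._origin(url)       # cache miss: fetch from origin
        self._cache[url] = content        # keep a copy for next time
        return content

origin_calls = []
def origin(url):
    origin_calls.append(url)              # record each trip to the origin
    return f"<html>content of {url}</html>"

edge = EdgeCache(origin)
edge.get("/video.mp4")    # miss: fetched from the origin
edge.get("/video.mp4")    # hit: served from the edge cache
print(len(origin_calls))  # 1 -- the origin was contacted only once
```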

Other companies reported similar trends of escalating internet traffic. The Internet & Television Association described how cable companies witnessed "a 19 percent increase in peak downstream traffic and a 33 percent increase in peak upstream traffic" during March 2020 alone. Similarly, wireless internet service providers, which primarily serve rural parts of the United States, reported a 36 percent increase in average traffic as of March 23. On March 30, network management software company ASSIA reported that total Wi-Fi upload traffic had increased by 80 percent. Upload traffic served as a particularly important indicator during the pandemic since many of the new online activities require much stronger upload speeds than traditional internet activities. These new virtual activities, such as teleconferencing or online teaching, are data-intensive activities that require symmetrical, robust internet speeds unlike checking e-mail or watching a YouTube video, which rely primarily on download speeds. 

Changes in Where People Got Online: Residential vs. Business Internet

The change in internet consumption coincided with a geographic shift in where people connected. Rather than connecting from public, educational, or business locations, which have more robust enterprise connections, people began connecting from home. Businesses generally invest in higher bandwidth services, known as enterprise or business internet. Describing the difference between residential and business internet services, Verizon states, "Residential internet often has restricted upload speeds and comes with only best-effort service agreements, while business internet demands faster upload speeds in order to perform operations. In addition, ISPs provide guaranteed service and uptimes for business internet." In other words, residential internet focuses on high download speeds while throttling upload speeds, whereas business internet generally offers symmetrical download and upload speeds. Business internet is also generally two to five times faster than residential connections and can offer a dedicated line so that a business does not need to share its connection with neighbors. Residential internet, by contrast, struggles with significant congestion. 
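The practical effect of asymmetric residential speeds is easy to quantify. The plan speeds below are hypothetical round numbers chosen for illustration, not figures from the article: two plans with the same download speed differ enormously in upload time.

```python
# Illustrative arithmetic: uploading a 1 GB video file on an asymmetric
# residential plan versus a symmetric business plan. Plan speeds are
# hypothetical examples, not quoted from any provider.

FILE_BITS = 8_000_000_000   # 1 gigabyte = 8 billion bits

residential_up_mbps = 10    # e.g., a 100 Mbps down / 10 Mbps up cable tier
business_up_mbps = 100      # e.g., a symmetric 100/100 Mbps business line

res_seconds = FILE_BITS / (residential_up_mbps * 1_000_000)
biz_seconds = FILE_BITS / (business_up_mbps * 1_000_000)

print(f"Residential upload: {res_seconds:.0f} s")  # 800 s (over 13 minutes)
print(f"Business upload:    {biz_seconds:.0f} s")  # 80 s
```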

Congestion in residential (Tier 3) networks occurs for a variety of reasons, such as outdated infrastructure or oversubscribed neighborhood nodes. While fiber optic cables largely make up the backbone (Tier 1) networks, only 10 percent of Americans had "fiber to the home" as of 2017. BroadbandNow's recent inquiry found an increase in this proportion, with 32 percent of Americans having fiber coverage. The cable and phone companies that provide last-mile home broadband offer primarily outdated cable or copper networks that can develop bottlenecks at network nodes where multiple lines converge. An oversubscribed neighborhood node results in households competing for bandwidth. In addition, cable's shared network nodes were originally built to deliver entertainment and therefore vastly prioritize download over upload capacity.

Changes in Peak Times

The pandemic changed the way people used the internet. Whereas internet traffic previously dropped on weekends, during the pandemic ISPs and CDNs observed only a minor weekend dip, if any at all. Akamai noted that "weekends have ceased, at least from a traffic perspective." Independent researchers observed the same temporal shift, with workdays taking on weekend-like traffic patterns. This is because customers used the internet for business and education during the week and then transitioned to streaming videos and gaming on the weekends. In addition, rather than meeting friends and family in person on the weekends, which would result in lower weekend internet usage, people resorted to virtual meetings during the pandemic.

Sustained High-Water Marks

After the initial surge in March, American internet traffic did not return to pre-coronavirus levels. Instead, internet traffic sustained the high-water marks throughout 2020. Internet traffic typically increases from month to month and year to year. For instance, Akamai generally sees 3 percent growth in internet usage each month. However, Akamai saw an unprecedented increase in internet usage that extended beyond just March 2020. Similarly, Cloudflare saw a significant leveling off of internet traffic from April to December 2020, demonstrating the "new normal" of internet traffic (see Figure 1). The question then remains at what rate internet traffic will continue to increase moving forward. Akamai CEO Tom Leighton predicts "traffic [will] continue to grow in 2021 but at a rate more in line with pre-2020 historical levels." 


Figure 1. Cloudflare's analysis of U.S. internet traffic distribution in 2020.

Two Competing Narratives: Resilient Network vs. Service Degradation

The year 2020 undeniably presented the biggest stress test for the internet to date. However, there is no clear consensus on how it performed. Instead, two competing narratives have emerged. Network engineers and operators argue that the internet ultimately prevailed when confronted with a sudden increase in traffic. End users, cities, and edge providers, by contrast, argue that the internet ultimately buckled, resulting in a serious degradation of quality. In fact, the data suggests that both answers are justified. The internet backbone continued to function, and packets were consistently delivered to their destinations. However, users still experienced significant degradation from congestion of last-mile networks as well as strained capacity of edge providers. 

First, let's examine the narrative regarding the internet's resiliency. Network engineers assert that the internet passed with flying colors when tested with a sudden surge of traffic. Matthew Prince, the co-founder and CEO of Cloudflare, explained that the internet can survive a few hours of Super Bowl traffic, so it can handle a sustained spike "for four weeks or four months or however long this heightened period of time happens." He stated that the internet does not simply "wear out" by increased use like a car. Backbone service providers also echoed that confidence and reassured customers that their networks were prepared. On March 21, Verizon stated, "Verizon's fiber optic and wireless networks have been able to meet the shifting demands of customers and continue to perform well." Kyle Malady, Verizon's CTO, reiterated this confidence, noting that "Verizon operates its networks every day as though it's a snow day."

The data on backbone providers' performance largely supports this narrative, as shown by analysis of Tier 1 carriers' packet loss and latency. The U.S. company Noction probed Tier 1 carriers' networks and found a slight uptick in packet loss and latency values mid-March. After mid-March, packet loss and latency largely leveled off.

However, ThousandEyes, a network intelligence company, probed Tier 1, Tier 2, and Tier 3 carriers and found a significant increase in network outages during March 2020. For North American ISPs, there were 65 percent more outages in March 2020 as compared to January 2020. However, the ThousandEyes report concluded that the network disruptions were due not to network congestion but instead to network maintenance. To support the network maintenance theory, ThousandEyes pointed to the fact that performance indicators, such as packet loss and latency, remained within tolerable ranges from March to June, demonstrating there was no systemic duress. Instead, network disruptions occurred largely outside of business hours. Yet taken together with Noction's analysis that there was a slight uptick in packet loss and latency in mid-March, the increased outages suggest that network operators were beginning to see the signs of network stress and upgraded accordingly.

Not everyone bought the ISPs' triumphant narrative. Doug Dawson, a broadband industry expert, expressed his skepticism: "By 'handling' the volumes [the ISPs] mean that their networks are not crashing and shutting down. But I think there is a whole lot more to these headlines than what they are telling the public." An examination of sluggish upload speeds as well as internet performance metrics at the city and neighborhood levels revealed significant degradation during the pandemic.

Household upload connections became major choke points for traffic with the increase in two-way video conversations. In what has become known as the "upload crisis," the upload path is susceptible to overload, especially when transmitting over the coaxial cable and telephone DSL networks that most neighborhoods rely on. ISPs have the power to configure the balance of download and upload traffic. However, the current Data Over Cable Service Interface Specification (DOCSIS) standard, which enables internet access over existing coaxial cables, limits the upload traffic to no more than 10 percent of the total bandwidth.
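The squeeze created by that split can be sketched with back-of-the-envelope arithmetic. The 10 percent upstream share comes from the article; the node capacity and household count below are hypothetical numbers I chose for illustration.

```python
# Illustrative sketch: if upstream is capped at roughly 10% of a cable
# node's total capacity, the usable upload per household shrinks quickly
# as homes share the node. Node size and household count are hypothetical.

def upstream_capacity_mbps(total_mbps: float, upstream_share: float = 0.10) -> float:
    """Upstream capacity under the ~10% cap described in the article."""
    return total_mbps * upstream_share

def per_household_up_mbps(total_mbps: float, households: int) -> float:
    """Upload bandwidth per household if the node's upstream is shared evenly."""
    return upstream_capacity_mbps(total_mbps) / households

node_total = 1000  # hypothetical 1 Gbps neighborhood node
print(upstream_capacity_mbps(node_total))      # 100.0 Mbps upstream total
print(per_household_up_mbps(node_total, 50))   # 2.0 Mbps each for 50 homes
```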

Numerous anecdotes demonstrate the upload crisis, as well as residents competing with business internet service. Reddit user jgoodm of Los Angeles, for example, asked in April 2020 how to improve his internet speeds. He explained, "My direct neighbor just installed Spectrum Business class at his house and he is a film editor. He is moving terabytes of data all morning editing a movie. As soon as he starts work in the morning, my ping times and latency get terrible, my VoIP becomes unusable and my kids online schooling becomes too choppy to actually use." Internet bandwidth becomes a zero-sum game in which enterprise services, which are guaranteed, ultimately throttle residential services. Unfortunately, similar node congestion in neighborhoods and apartment complexes has been an all too frequent experience during the work-from-home era.

An examination of city and neighborhood internet performance further points to strained networks. Of the top 200 U.S. cities, 88 cities (44 percent) experienced some level of network degradation. Three cities (Austin, Texas; Winston-Salem, North Carolina; and Oxnard, California) deviated from their 10-week median download speed range by more than 40 percent. In addition, the number of counties that failed to meet the Federal Communications Commission's minimum standard for broadband connectivity rose from 1,708 counties (52.8 percent) in February 2020 to 2,012 counties (62.2 percent) in March 2020. This city- and county-level data reveal that the national trends do not tell the full story of the internet's resilience. Households endured significantly different internet degradation depending on their particular neighborhood or city. While 2020 didn't break the Tier 1 carriers, there is a strong indication that it exposed the cracks of the internet at the regional and neighborhood levels.

Defining "the Internet": A Classic Category-Mistake

What does it mean for "the internet" to break? While the term "the internet" is now thrown around widely to refer to a global platform for information and communication, it is useful to ground the term in its original definition. In 1974, internet founders Vinton Cerf and Robert Kahn published "A Protocol for Packet Network Interconnection." While the paper did not explicitly use the term "internet," it described "internetwork" protocols (TCP/IP) that would enable networks to communicate with one another. In 1995, the Federal Networking Council (FNC) unanimously passed a resolution providing the first official definition of the internet. In the inaugural definition, the FNC described the internet as "the global information system that … is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols." The seminal Cerf-Kahn paper and the FNC definition underscore the important fact that the internet itself is not a network. Instead, the internet is the protocols and rules that allow multiple networks to communicate with each other.

Since the internet is the rules that enable other networks to communicate, it represents a larger conceptual framework than each individual network on its own. Conflating the internet with an individual network is thus a classic example of a "category-mistake." Philosopher Gilbert Ryle introduced the concept of a category-mistake in 1949 through a simple story. Ryle told the tale of a tourist visiting Oxford. After viewing the various colleges, libraries, museums, and administrative offices, the tourist asked, "But where is the University?" As Ryle explained, "the University" is simply the way in which all of the buildings the tourist had already seen are organized. In essence, a category-mistake is a logical error.

Similarly, the concept of "the internet" is the larger category of how all of the various networks connect and communicate. It is the overall aggregation of connected networks—whether it be a home network, business network, or university network—rather than any one individual network. The internet cannot fail unless all of the individual networks fail, which is a highly unlikely scenario. Therefore, the network operators' narrative that the internet didn't collapse when faced with surging traffic during the coronavirus pandemic may be true, but it also hides the ball. This triumphant conclusion obscures the degradation experienced within individual and regional networks.

SCOTUS Limits the Reach of the Computer Fraud and Abuse Act, with Implications for Cybersecurity, Trade Secrets Litigation, and Beyond - Lexology

Posted: 09 Jun 2021 11:50 PM PDT

On June 3, 2021, the US Supreme Court issued a much-anticipated decision interpreting the scope of the federal Computer Fraud and Abuse Act of 1986 (CFAA) not to cover situations in which the defendant was authorized to access information on a computer yet did so for an improper purpose. [1] The decision, which was widely expected after oral argument in November, carries a range of cybersecurity implications for businesses, such as the need to reassess employee access and website terms of service, while simultaneously narrowing the types of trade secret misappropriation that can be addressed through civil claims under the CFAA. In addition, one aspect of the Court's reasoning, involving the types of "damages" or "loss" required for civil CFAA claims, may further limit the damages available in civil claims brought under the CFAA to harm caused by the intrusion itself rather than any downstream harm caused by misappropriation or misuse of the information obtained.[2] At the same time, the decision leaves several key issues unresolved.

The CFAA and the Supreme Court's Decision in Van Buren

The CFAA creates a criminal offense and also permits civil recovery against anyone who "intentionally accesses a computer without authorization or exceeds authorized access, and thereby obtains" information.[3] As amended, the CFAA applies broadly to any computer that connects to the Internet.[4]

In Van Buren, the Supreme Court resolved a split among federal circuit courts of appeal concerning just how broadly the law should apply. At issue was the meaning of the term "exceeds authorized access," which the CFAA defines as "to access a computer with authorization and to use such access to obtain or alter information in the computer that the accesser is not entitled so to obtain or alter."[5]

Van Buren involved the criminal conviction under the CFAA of Nathan Van Buren, a police sergeant in Georgia. Van Buren used his patrol-car computer to access a law enforcement database to run the license plates of a woman whom an acquaintance had met at a local strip club.[6] Cooperating with the FBI as part of a sting operation, the acquaintance promised to pay Van Buren $5,000 for the information.[7] Van Buren had been trained not to use the law enforcement database for improper purposes, such as personal use.[8] The Government obtained a conviction under the CFAA by arguing that, although he was permitted to access the database for law-enforcement purposes, Van Buren had unlawfully exceeded authorized access by using the database for a personal purpose.[9]

In a 6-3 decision, the Supreme Court disagreed and reversed the Eleventh Circuit's decision affirming Van Buren's conviction. The sole issue before the Supreme Court was whether Van Buren was "entitled so to obtain" the specific record from the law enforcement database.[10] There was no dispute that he was authorized to access the database when related to his duties as a law enforcement officer.[11]

The Court held that because Van Buren had authority to access the database as part of his job, he could not be convicted under the CFAA for misusing that access.[12] Writing for the majority, Justice Barrett explained that "an individual 'exceeds authorized access' when he accesses a computer with authorization but then obtains information located in particular areas of the computer--such as files, folders, or databases--that are off limits to him."[13] Van Buren's conduct did not satisfy this standard because he did not access any area of a computer system that was "off limits" to him; instead, he merely accessed the law enforcement database with an improper purpose.[14]

The Court's decision significantly reduces the scope of conduct that the CFAA criminalizes and subjects to civil liability.

Implications for Cybersecurity

The decision has a number of cybersecurity implications for businesses, ranging from insider threats to vulnerability disclosure programs.

Review of Employee Access and Network Management. Many companies with sensitive and proprietary data provide their employees or contractors network access conditioned on appropriate business uses, identified in click-through banners, employment contracts, or employee handbooks. For example, an employee or contractor may be granted credentialed access to a company's intellectual property or other sensitive business information on the condition that they only use the access for approved purposes. But given the Court's interpretation of authorization under CFAA as a "gates-up-or-down" concept,[15] companies may need to revisit these policies with Van Buren's holding in mind. In particular, if the goal is to deter harmful computer use through civil or criminal enforcement of the CFAA, revising contractual, policy, or banner language to explicitly bar access to certain content or files will be a critical step, potentially with accompanying technological restrictions.[16] In addition, companies concerned with potential insider threats will want to engage in a comprehensive review of their network management to identify which users have access to sensitive information, where that information is located, and whether the access is warranted. The effort would ideally involve both IT and legal departments, with the goal of ensuring that (1) sensitive data is clearly identified and separated within the network, and (2) authority to access data is clearly and unambiguously conveyed and/or technically restricted to only those with a need to know.

Data Scraping and Terms of Service. Many public-facing websites seek to limit third-party use (or misuse) of data through terms of service, as a way to prevent data scrapers or competitors from copying data or taking up bandwidth. But the Van Buren decision makes clear that the CFAA's "exceeds authorized access" provision does not apply to data scraping if the offending party is authorized to access the website at all, thereby limiting the legal effect of the terms of service.[17] Under Van Buren, the federal CFAA is not a remedy for website owners absent further action to revoke authorization from an offending party. As a practical matter, website owners that seek to protect their information may--at a minimum--need to take additional steps to monitor and detect efforts by scraping entities, identify the source, and clearly revoke all authorization (again, including through the use of technological measures). In the alternative, website owners will want to consider whether to shift from a publicly-accessible site to a gated one, in which they can exercise a greater degree of control over who may access the site in the first instance and more easily monitor and revoke authorization as desired.

Computer Security Research. Entities seeking to improve network and product security have increasingly turned to bug bounty programs, offering compensation to third-party computer security researchers for authorized discovery of vulnerabilities. The computer security research community largely favored Van Buren's position because of the chilling effect of a potential CFAA prosecution on their work. Now that the Court has adopted that view, however, entities engaged in a bug bounty program may want to revisit how their authorizations are structured and the technical accesses that are granted, in consultation with counsel.

Impact on Trade Secrets Litigation

As the law and technology developed since the CFAA was enacted in 1986, the CFAA's civil liability provisions (§ 1030(g)) became an additional tool for victims of trade secret misappropriation that occurred by use of a computer. Before Van Buren, the First, Fifth, Seventh, and Eleventh Circuits had adopted the broad view of "exceeds authorized access" espoused by the Government in Van Buren.[18] In those circuits, an employee with access to his employer's computer network for business purposes could be prosecuted or sued under an "exceeding authorized access" theory if the employee accessed data for the employee's own self-interest.[19] This interpretation allowed trade secrets plaintiffs to bring CFAA claims alongside applicable trade secrets claims, and in some instances to prevail under the CFAA even where they could not meet all of the elements of a claim for trade secret misappropriation.

The Van Buren holding, which sided with the narrow reading that had been adopted by the Second, Fourth, Sixth, and Ninth Circuits, will significantly restrict those types of claims.[20] However, the CFAA after Van Buren still imposes civil liability for traditional hacking, as well as situations where an employee accesses a location--defined by Justice Barrett to include "files, folders, or databases"--on the employer's computer system that the employee is not permitted to access at all or that the employee accesses only after authorization had been terminated or revoked.

A Narrowed Definition of "Loss"?

The Van Buren decision may suggest an additional hurdle for trade secret and other plaintiffs seeking to prove civil liability under the CFAA. As part of its analysis explaining that the CFAA was designed to address traditional hacking, the Supreme Court articulated a view of the CFAA's civil "damage" and "loss" provisions that is narrower than the interpretation adopted among some lower courts.

Under the CFAA, plaintiffs pursuing a private cause of action must demonstrate, in addition to the other elements of liability, either "damage" or "loss."[21] The CFAA defines "damage" as "any impairment to the integrity or availability of data, a program, a system, or information,"[22] and "loss" as "any reasonable cost to any victim, including the cost of responding to an offense, conducting a damage assessment, and restoring the data, program, system, or information to its condition prior to the offense, and any revenue lost, cost incurred, or other consequential damages incurred because of interruption of service."[23] Reviewing these definitions as part of its assessment of the meaning of the CFAA's "exceeds authorized access" provision, the Supreme Court stated that the terms "damage" and "loss" "focus on technological harms--such as the corruption of files--of the type unauthorized uses cause to computer systems and data" and are "ill fitted . . . to remediating 'misuse' of sensitive information that employees may permissibly access using their computers."[24]

Defendants facing a CFAA claim may point to the Court's comments about the definition of "loss" to try to further narrow the scope of CFAA's civil liability provisions. Before Van Buren, some courts had focused on the "any reasonable cost to any victim" portion of the definition of "loss" and concluded that "loss" could occur without any technological damage or interruption in service and cover the use (or misuse) of data obtained by a party in violation of the CFAA. For example, some district courts around the country had determined that "loss" under the CFAA means both (1) the costs associated with technical interruptions, restoring affected data, and responding to the violation, as well as (2) consequential losses from the misuse of data obtained by the defendant.[25] The Court's reasoning in Van Buren--though arguably only dicta--affirmatively endorses only the first type of loss. The Supreme Court's comments thus reinforce the need for trade secrets plaintiffs who seek to assert a claim under the CFAA to evaluate carefully the costs they have incurred as a result of the defendant's unauthorized access and to pay particular attention to gathering evidence of any costs related to remedying technological harms caused by the unauthorized access, and not just the misuse of data thereby obtained (which now may be outside the scope of recovery entirely).

Open Questions and What Comes Next

While this decision resolves a long-standing circuit split, the Court leaves open a number of important issues that are likely to be front and center in future CFAA litigation. First and foremost, the opinion expressly reserves whether the concept of "authorization" under the CFAA "turns only on technological (or 'code-based') limitations on access," or "also looks to limits contained in contracts or policies."[26] In other words, Van Buren tells us that CFAA liability for "exceeds authorized access" turns on whether a party has the requisite authorization, but does not clarify what that authorization must consist of. How authorization is defined and communicated (and in particular, whether it requires code-based restrictions, such as password requirements) will almost certainly be the next significant dispute.

Second, and relatedly, the decision does not tell us what the default is with respect to authorization. In other words, must authorization be expressly given in order for computer resources to be accessed, or must it be expressly revoked to limit access? For example, it may be that a person browsing the public internet is presumed to have authorization to access any website that they can navigate to. But a person who is on a sensitive corporate system might be presumed to lack authorization unless explicitly granted. Whether authorization is a one-size-fits-all or a context-dependent inquiry remains an open question.

Finally, it is important to keep in mind that the CFAA is an important computer crime law, but not the only one. Many states have CFAA analogs that are worded or interpreted differently. Some of them may clearly prohibit the type of computer misuse at issue in Van Buren, while others may use different language that the Court's reasoning does not reach. Whether and to what degree this opinion influences interpretation of analogous state laws remains to be seen.

Volcano Watch: What defines an eruption pause? - Hawaii Tribune-Herald

Posted: 06 Jun 2021 03:05 AM PDT

Kilauea's recent volcano alert-level change, from Watch to Advisory, has attracted some attention.

The June 1 USGS Hawaiian Volcano Observatory Kilauea weekly update summary reads: "Kilauea Volcano is no longer erupting. No surface activity has been observed … It is possible that the Halema'uma'u vent could resume eruption or that Kilauea is entering a period of quiescence prior to the next eruption."


We pick up the conversation where last week's Volcano Watch article left off, with a more detailed explanation of why a three-month-long window is useful in defining an eruption "pause." We'll look at this from both a global (statistical) perspective and a Kilauea (historical) perspective.

The Smithsonian Global Volcanism Project maintains a database of all known volcanic eruptions. This database provides a broad range of eruption statistics, including global averages of eruption frequency and pauses. For known eruptions that have been well observed, a "pause" in activity within an eruption can typically last up to 90 days.

When a gap in activity lasts longer than 90 days, it typically (but not always) marks the start of a much longer period of volcanic rest, which can stretch from years to millennia (compare a frequently active volcano with a long-quiet stratovolcano). Any new eruptive activity then becomes "the next eruption." A new eruption could begin in the same region (for example, the summit region) or in a different region, such as a rift zone, and should be preceded by the precursory unrest typical of that volcano.

If an eruption resumes, it will often do so within the 90-day window and, typically (but not always), lava resumes erupting from the same vent. A review of Kilauea's recorded history since 1823 shows that the Smithsonian's 90-day window of inactivity mostly holds true, with one exception: a pause lasting 3.5 months during the Maunaulu eruption of 1969–74.
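As a rough illustration (not an HVO tool), the 90-day convention described above amounts to a simple decision rule. The sketch below uses hypothetical names (`PAUSE_WINDOW`, `classify_gap`) to show how a gap in activity would be classified under that rule:

```python
from datetime import date, timedelta

# Smithsonian convention: a gap of up to 90 days within an eruption is a "pause".
PAUSE_WINDOW = timedelta(days=90)

def classify_gap(last_activity: date, check_date: date) -> str:
    """Classify a gap in eruptive activity under the 90-day rule (illustrative only)."""
    gap = check_date - last_activity
    if gap <= PAUSE_WINDOW:
        return "pause"          # still considered part of the same eruption
    return "eruption over"      # any later activity counts as the next eruption

# Example: Kilauea's summit activity paused May 27, 2021; checked in early June.
print(classify_gap(date(2021, 5, 27), date(2021, 6, 6)))  # pause
```

The real determination, of course, rests on monitoring data (deformation, seismicity, gas emissions), not the calendar alone; the window is a statistical rule of thumb.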

The next longest pauses on Kilauea were recorded during the first three years (1983–1986) of the Pu'u'O'o eruption on Kilauea's middle East Rift Zone, when 48 short-lived high-fountaining episodes were separated by pauses lasting days to months. The longest pauses were between high-fountaining episodes 3–4 (65 days), episodes 32–33 (52 days), episodes 12–13 (50 days), episodes 39–40 (49 days), episodes 25–26 (43 days), and episodes 31–32 (38 days). The Kilauea Iki eruption in 1959 also had pauses lasting hours to several days between lava fountain episodes.

The pauses between episodic fountaining during these eruptions are also called "repose periods." HVO scientists were able to tell that the eruption had only paused because each fountaining episode was followed by predictable patterns of rapid inflation and escalating earthquake activity.

All other well-documented mid-eruption pauses at Kilauea ended within a month or less. Most recently, there were two pauses in Kilauea's 2018 lower East Rift Zone eruption. From May 9–12, 2018, a 63-hour-long pause ended with an eruption from a new vent, fissure 16. At Ahu'aila'au (fissure 8), there was a 15-day pause in lava effusion at the end of August 2018 before lava reappeared during Sept. 1–4. After a 90-day window, HVO determined that the eruption was over. Kilauea then entered a 2.25-year-long period of rest that ended with the summit fissure eruption in Halema'uma'u crater that began Dec. 20, 2020.

Kilauea's recent summit eruption within Halema'uma'u was determined to be paused on May 27, after a period with no visible lava, no rise of the lake surface, and a decrease in sulfur dioxide (SO2) emissions. If the pause continues through August 24, it will likely mean this eruption is over.

In the past, numerous eruptions have taken place within Halema'uma'u crater, the home of Pele, the Hawaiian volcano deity. Continued diligent monitoring of Kilauea by HVO will tell us over the next several months whether the eruption will continue or whether we must wait longer for the next eruption to begin. Quiescence between eruptions on Kilauea can last months to decades, and HVO monitors the volcano closely for any signs of renewed activity.

Volcano activity updates

Kilauea Volcano is not erupting. Its USGS Volcano Alert level is at ADVISORY (https://www.usgs.gov/natural-hazards/volcano-hazards/about-alert-levels). Kilauea updates are issued weekly.

Lava supply to the Halema'uma'u lava lake has ceased and sulfur dioxide emissions have decreased to near pre-eruption background levels. Summit tiltmeters recorded slight, oscillating deflation-inflation cycles over the past week. Seismicity remains stable overall, with slightly increased earthquake counts and tremor over the past week. There are currently no indications suggesting that a resumption of volcanic activity is imminent. Kilauea remains an active volcano and future eruptions are possible at the summit or elsewhere on the volcano. For more information on current monitoring of Kilauea Volcano, see https://www.usgs.gov/volcanoes/Kilauea/monitoring.

Mauna Loa is not erupting and remains at Volcano Alert Level ADVISORY. This alert level does not mean that an eruption is imminent or that progression to an eruption from the current level of unrest is certain. Mauna Loa updates are issued weekly.

This past week, about 55 small-magnitude earthquakes were recorded below Mauna Loa; most of these occurred below the summit and upper-elevations at depths of less than 8 kilometers (about 5 miles). Global Positioning System measurements show low rates of deformation in the summit region over the past week. Gas concentrations and fumarole temperatures at both the summit and at Sulphur Cone on the Southwest Rift Zone remain stable. Webcams show no changes to the landscape. For more information on current monitoring of Mauna Loa Volcano, see: https://www.usgs.gov/volcanoes/mauna-loa/monitoring.

There were 4 events with 3 or more felt reports in the Hawaiian Islands during the past week: a M4.0 earthquake 42 km (26 mi) ESE of Naalehu at 10 km (6 mi) depth on June 2 at 6:44 p.m. HST, a M2.8 earthquake 5 km (3 mi) SSW of Volcano at 1 km (0 mi) depth on June 2 at 4:14 p.m. HST, a M3.4 earthquake 10 km (6 mi) NE of Pahala at 32 km (20 mi) depth on May 31 at 5:59 a.m. HST, and a M3.2 earthquake 18 km (11 mi) WNW of Kalaoa at 42 km (26 mi) depth on May 29 at 11:13 p.m. HST.

HVO continues to closely monitor both Kilauea and Mauna Loa for any signs of increased activity.


Please visit HVO's website for past Volcano Watch articles, Kilauea and Mauna Loa updates, volcano photos, maps, recent earthquake info, and more. Email questions to askHVO@usgs.gov.

Volcano Watch is a weekly article and activity update written by U.S. Geological Survey Hawaiian Volcano Observatory scientists and affiliates.

Nevada Seeks to Broaden Online Privacy Laws | Hinshaw Privacy & Cyber Bytes - Insights on Compliance, Best Practices, and Trends - JDSupra - JD Supra

Posted: 09 Jun 2021 01:28 PM PDT

On May 25, 2021, the Nevada legislature passed Senate Bill 260, which would amend the state's online privacy notice statutes. Sponsored by Nevada state Sen. Nicole Cannizzaro, the Bill would broaden Nevada's existing right to opt out of sales of covered information.

The Bill amends the definition of "sale" to expand the types of activity that could be considered sales, providing greater protection to Nevada residents. It also creates a new category of covered entities: "data brokers." "Covered Information" is defined more narrowly than "Personal Information" under the California Consumer Privacy Act. The Bill would also create a number of new exemptions.

To whom would it apply?

In addition to operators, the Bill identifies data brokers as a new category of covered entities. The Bill defines a data broker as "a person whose primary business is purchasing covered information about consumers with whom the person does not have a direct relationship and who reside in this State from operators or other data brokers and making sales of such covered information." 

The Bill would not apply to:

  • Consumer reporting agencies;
  • A person who collects, maintains, or makes sales of personally identifiable information for the purposes of fraud prevention;
  • Any personally identifiable information protected from disclosure under the federal Driver's Privacy Protection Act of 1994, 18 U.S.C. §§ 2721 et seq., which is collected, maintained or sold as provided in that Act; and
  • A person who does not collect, maintain, or make sales of covered information.

What types of information would it cover?

Covered Information includes any one or more of the following items of personally identifiable information about a consumer, collected by an operator through an Internet website or online service, and maintained by the operator or a data broker in an accessible form:

  • First and last name;
  • Home or other physical address which includes the name of a street and the name of a city or town;
  • Email address;
  • Telephone number;
  • Social security number;
  • An identifier that allows a specific person to be contacted either physically or online; and
  • Any other information concerning a person that is collected from the person through the Internet website or online service of the operator and maintained by the operator or data broker in combination with an identifier in a form that makes the information personally identifiable.

What rights would it create?

Under the Bill, "sale" means "the exchange of covered information for monetary consideration by an operator or data broker to another person," which expands the types of activity that could be considered sales. Nevada's current law defines a sale more narrowly as "the exchange of covered information for monetary consideration by the operator to a person for the person to license or sell the covered information to additional persons."

The Bill would provide Nevada residents with greater rights to opt-out of sales of their personal information by requiring more companies to comply with notice requirements than under the current law.

What obligations would it impose?

Obligations for data brokers under the Bill include the following:

  • Each data broker shall establish a designated request address through which a consumer may submit a verified request to a data broker directing the data broker not to make any sale of any covered information about the consumer that the data broker has purchased or will purchase.
  • A data broker who has received a verified request submitted by a consumer shall not make any sale of any covered information about that consumer that the data broker has purchased or will purchase.
  • A data broker shall respond to a verified request submitted by a consumer within 60 days after receipt thereof.

How would it be enforced?

The Nevada Attorney General is tasked with enforcing these online privacy laws and may institute appropriate legal proceedings against operators and data brokers. If a district court finds that a violation has occurred, it may issue a temporary or permanent injunction or impose a civil penalty of up to $5,000 per violation.

Nevada law does not provide for a private right of action against operators. However, it is unclear whether a private right of action against data brokers is available.

Where does it stand? 

The Bill was signed by the Governor on June 2, 2021. It will go into effect on October 1, 2021. 

With this Bill, Nevada joins other states focused on bringing transparency to data brokers and the data brokerage industry. Vermont and California, for example, have passed data broker registration laws (9 V.S.A. § 2430 and Cal. Civ. Code § 1798.99.80 et seq.). Vermont's Attorney General brought the state's first such enforcement action against Clearview AI, a data broker that uses facial recognition technology to map the faces of Vermonters, including children, and sells access to this data to private businesses, individuals, and law enforcement; the state alleged that Clearview violated the data broker law by fraudulently acquiring data through screen scraping.
