New Media and Technology Law Blog

Financial Data Aggregator Faces Consumer Privacy Suit over “Surreptitious” Collection of Banking Information

Last week, a putative privacy-related class action was filed in California district court against financial analytics firm Envestnet, Inc. (“Envestnet”), which operates Yodlee, Inc. (“Yodlee”). (Wesch v. Yodlee Inc., No. 20-05991 (N.D. Cal. filed Aug. 25, 2020)). According to the complaint, Yodlee is one of the largest financial data aggregators in the world and through its software platforms, which are built into various fintech products offered by financial institutions, it aggregates financial data such as bank balances and credit card transaction histories from individuals in the United States. The crux of the suit is that Yodlee collects and then sells access to such anonymized financial data without meaningful notice to consumers, and stores or transmits such data without adequate security, all in violation of California and federal privacy laws.

The timing of this case is interesting, as it comes on the heels of the recent settlement of the litigation between the City Attorney of Los Angeles and the operator of a weather app over claims that locational information collected through the weather app was being sold to third parties without adequate permission from the user of the app.

Eclipsed by Evolving Law, Policy and Technology, Seminal Mobile Location Data Case Settled

This past week, the operator of the popular Weather Channel (“TWC”) mobile phone app entered into a Stipulation of Settlement with the Los Angeles City Attorney, Mike Feuer (“City Attorney”), closing the books on one of the first litigations to focus on the collection of locational data through mobile phones. (People v. TWC Product and Technology, LLC, No. 19STCV00605 (Cal. Super., L.A. Cty, Stipulation Aug. 14, 2020)). While the settlement appears to allow TWC to continue to use locational information for app-related services and to serve advertising (as long as the app includes some agreed-upon notices and screen prompts to consumers), what is glaringly absent from the settlement is a discussion of sharing locational information with third parties for purposes other than serving advertising or performing services in the app. Because applicable law, industry practice and the policies of Apple and Google themselves have narrowed the ability to share locational information for such purposes, the allegations of the case were, in a sense, subsumed in the tsunami of attention that locational information sharing has attracted. While some are viewing this settlement as a roadmap for locational information collection and sharing, in fact the settlement is quite narrow.

Commerce Dept. Petitions FCC to Issue Rules Clarifying CDA Section 230

The currents around the Communications Decency Act just got a little more turbulent as the White House and executive branch try to reel in the big fish of CDA reform.

On July 27, 2020, the Commerce Department submitted a petition requesting that the FCC initiate a rulemaking to clarify the provisions of Section 230 of the Communications Decency Act (CDA). Unrelated, but part of the same fervor in Washington to “rein in social media,” the leaders of the major technology companies appeared before the House Judiciary Antitrust Subcommittee at a hearing yesterday, July 29, 2020, to discuss the Committee’s ongoing investigation of competition in the digital marketplace, where some members inquired about content moderation practices. Moreover, last month, a pair of Senators introduced the PACT Act, a targeted (but still substantial) update to the CDA (and other CDA reform bills are also being debated, including a bill to carve out sexually exploitative material involving children from the CDA’s reach).

The Communication Decency Act and the DOJ’s Proposed Solution: No Easy Answers

Section 230 of the Communications Decency Act (“CDA”), 47 U.S.C. §230, enacted in 1996, is often cited as the most important law supporting the Internet, e-commerce and the online economy. Yet, it continues to be subject to intense criticism, including from politicians from both sides of the aisle. Many argue that the CDA has been applied in situations far beyond the original intent of Congress when the statute was enacted. Critics point to the role the CDA has played in protecting purveyors of hate speech, revenge porn, defamation, disinformation and other objectionable content.

Critics of the CDA raise valid concerns.  But what is the right way to address them? One must remember that for organizations that operate websites, mobile apps, social media networks, corporate networks and other online services, the CDA’s protections are extremely important.  Many of those businesses could be impaired if they were subject to liability (or the threat of liability) for objectionable third party content residing on their systems.

The criticism surrounding the CDA hit a fever pitch on May 28, 2020 when the President weighed in on the issue by signing an Executive Order attempting to curtail legal protections under Section 230. While the Executive Order was roundly labelled as political theater – and is currently being challenged in court as unconstitutional – it notably directed the Justice Department to submit draft proposed legislation (i.e., a CDA reform bill) to accomplish the policy objectives of the Order. This week, on June 17, 2020, the DOJ announced “that the time is ripe to realign the scope of Section 230 with the realities of the modern internet” and released a document with its recommendations for legislative reform of Section 230.  This is on the heels of a recent initiative by several GOP lawmakers to introduce their own version of a reform bill.

Wholesale Scraping of “Public” Data May Be Trade Secret Misappropriation

In what could prove to be an important decision within the context of scraping of “public” data, in a recent case the Eleventh Circuit reversed a lower court’s dismissal of trade secret claims relating to the scraping of insurance quotes. (Compulife Software, Inc. v. Newman, No. 18-12004 (11th Cir. May 20, 2020)). The appellate court agreed with the lower court that while Compulife’s insurance quote database was a trade secret, manually accessing life insurance quote information from the plaintiff’s publicly web-accessible database would generally not constitute the improper acquisition of trade secret information.  However, the court disagreed with the lower court in finding that the use of automated techniques to scrape large portions of the database could constitute “improper means” under state trade secret law.  In reversing the lower court’s dismissal of the trade secret claims, the appeals court stressed that “the simple fact that the quotes taken were publicly available does not automatically resolve the question in the defendants’ favor.”  Even though there was no definitive ruling in the case – as the appeals court remanded the case for further proceedings – it is certainly one to watch, as there are very few cases where trade secret claims are pleaded following instances of data scraping.

President Signs Executive Order Directing Agencies to Probe the Contours of CDA Immunity

President Trump signed an Executive Order today attempting to curtail legal protections under Section 230 of the Communications Decency Act (“Section 230” or the “CDA”). The Executive Order strives to clarify that Section 230 immunity “should not extend beyond its text and purpose to provide protection for those who purport to provide users a forum for free and open speech, but in reality use their power over a vital means of communication to engage in deceptive or pretextual actions stifling free and open debate by censoring certain viewpoints.”

Section 230 protects online providers in many respects concerning the hosting of user-generated content and bars the imposition of distributor or publisher liability against a provider for the exercise of its editorial and self-regulatory functions with respect to such user content.  In response to certain moderation efforts toward the President’s own social media posts this week, the Executive Order seeks to remedy what the President claims is the social media platforms’ “selective censorship” of user content and the “flagging” of content that does not violate a provider’s terms of service.

The Executive Order does a number of things. It first directs:

  • The Commerce Department to file a petition for rulemaking with the FCC to clarify certain aspects of CDA immunity for online providers, namely Good Samaritan immunity under 47 U.S.C. §230(c)(2). Good Samaritan immunity provides that an interactive computer service provider may not be made liable “on account of” its decision in “good faith” to restrict access to content that it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.”
    • The provision essentially gives providers leeway to screen out or remove objectionable content in good faith from their platforms without fear of liability. Courts have generally held that the section does not require that the material actually be objectionable; rather, the CDA affords protection for blocking material that the provider or user considers to be “objectionable.” However, Section 230(c)(2)(A) requires that providers act in “good faith” in screening objectionable content.
    • The Executive Order states that this provision should not be “distorted” to protect providers that engage in “deceptive or pretextual actions (often contrary to their stated terms of service) to stifle viewpoints with which they disagree.” The order asks the FCC to clarify “the conditions under which an action restricting access to or availability of material is not ‘taken in good faith’” within the meaning of the CDA, specifically when such decisions are inconsistent with a provider’s terms of service or taken without adequate notice to the user.
    • Interestingly, the Order also directs the FCC to clarify if there are any circumstances where a provider that screens out content under the CDA’s Good Samaritan protection but fails to meet the statutory requirements should still be able to claim protection under CDA Section 230(c)(1) for “publisher” immunity (as, according to the Order, such decisions would be the provider’s own “editorial” decisions).
    • To be sure, courts in recent years have dismissed claims against services for terminating user accounts or screening out content that violates content policies using the more familiar Section 230(c)(1) publisher immunity, stating repeatedly that decisions to terminate an account (or not publish user content) are publisher decisions, and are protected under the CDA. It appears that the Executive Order is suggesting that years of federal court precedent – from the landmark 1997 Zeran case until today – that have espoused broad immunity under the CDA for providers’ “traditional editorial functions” regarding third party content (which include the decision whether to publish, withdraw, postpone or alter content provided by another) were perhaps decided in error.

President to Unveil Executive Order to Address CDA Section 230 Protections

UPDATE: On the afternoon of May 28, 2020, the President signed the executive order concerning CDA Section 230. A copy/link to the order has not yet been posted on the White House’s website.

According to news reports, the Trump Administration (the “Administration”) is drafting and the President is set to sign an executive order to attempt to curtail legal protections under Section 230 of the Communications Decency Act (“Section 230” or the “CDA”). Section 230 protects online providers in many respects concerning the hosting of user-generated content and bars the imposition of distributor or publisher liability against a provider for the exercise of its editorial and self-regulatory functions with respect to such user content. In response to certain moderation efforts toward the President’s own social media posts this week, the executive order will purportedly seek to remedy what the President claims is the social media platforms’ “selective censorship” of user content and the “flagging” of content that is inappropriate, “even though it does not violate any stated terms of service.”

A purported draft of the executive order was leaked online. If issued, the executive order would, among other things, direct federal agencies to limit monies spent on social media advertising on platforms that violate free speech principles, and direct the White House Office of Digital Strategy to reestablish its online bias reporting tool and forward any complaints to the FTC. The draft executive order suggests that the FTC use its power to regulate deceptive practices against those platforms that fall under Section 230 to the extent they restrict speech in ways that do not match with posted terms or policies.  The order also would direct the DOJ to establish a working group with state attorneys general to study how state consumer protection laws could be applied to social media platforms’ moderation practices.  Interestingly, the executive order draft would also direct the Commerce Department to file a petition for rulemaking with the FCC to clarify the conditions under which an online provider removes “objectionable content” in good faith under the CDA’s Good Samaritan provision (which is a lesser-known, yet important companion to the better-known “publisher” immunity provision).

French Data Protection Authority Speaks to Web Scraping

Late last month, the French data protection authority, the CNIL, published guidance surrounding considerations behind what it calls “commercial prospecting,” meaning scraping publicly available website data to obtain individuals’ contact information for purposes of selling such data to third parties for direct marketing purposes.  The guidance is noteworthy in two respects.  First, it speaks to the CNIL’s view of this activity in the context of the GDPR and privacy concerns.  Second, and of more general interest, the guidance lays out some guiding principles for companies that conduct screen scraping activities or hire outside vendors to collect and package such data.

You can read our summary of the CNIL guidance on our firm’s Privacy Law Blog.

Protecting Business Information Assets in the “Work From Home” Environment

This past March, many organizations were forced to suddenly pivot to a “work from home” environment (“WFH”) as COVID-19 spread across our country.  However, many companies did not have the necessary technical infrastructure in place to support their full workforce on a WFH basis.  Often, remote access systems were configured assuming only a portion of a company’s employees – not 100% of a company’s employees – would be remotely accessing the corporate networks simultaneously.  In addition, many employees have limited home Wi-Fi capacity that is insufficient to sustain extended, robust connections with the office systems.  Networks can then become overloaded, connections dropped, and employees can experience extended latency issues, frozen transmissions and the like.

As a result, many employees are using a work-around — often with their employer’s knowledge and approval.  They connect their personal devices to their employer’s network to download what they need from the network, but disconnect to perform the bulk of their work offline.  On a periodic basis and upon the completion of the task at hand, those employees then typically upload or distribute the work product to the organization’s network.