PX14A6G 1 r426212px14a6g.htm

 

             2150 Kittredge St. Suite 450                    www.asyousow.org

Berkeley, CA 94704

   

BUILDING A SAFE, JUST, AND SUSTAINABLE WORLD SINCE 1992

 

Notice of Exempt Solicitation Pursuant to Rule 14a-103

 

Name of the Registrant: Facebook, Inc. (FB)
Name of persons relying on exemption: As You Sow
Address of persons relying on exemption: 2150 Kittredge St. Suite 450, Berkeley, CA 94704

 

Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule, but is made voluntarily in the interest of public disclosure and consideration of these important issues.

 

 

 

Facebook, Inc. (FB)
Vote Yes: Item #8 – Shareholder Proposal on Board Assessment of Misinformation Mitigation Practices

Annual Meeting: May 26, 2021

CONTACT: Andrew Behar | abehar@asyousow.org

 

 

THE RESOLUTION

 

Resolved: Shareholders request that the Board prepare a report to assess the benefits and drawbacks to our Company of maintaining or restoring the type of enhanced actions put in place during the 2020 election cycle to reduce the platform’s amplification of false and divisive information.

 

Supporting Statement: The report, prepared at reasonable cost and omitting proprietary and privileged information, could, at Board discretion, characterize and quantify the benefits or harms of such enhanced actions on, among other things:

 

•  Employee morale, recruitment, and retention;
•  The existence and impact of public boycott campaigns;
•  Legal and regulatory actions against the company related to content;
•  Revenue and earnings.

 

The report should be made available by December 2021.

 

 

 

SUMMARY

 

Over the past several years, Facebook has been criticized for allowing false and divisive content to proliferate. Examples include Facebook serving as a platform for Libyan users to purchase arms1, a tool for Russian actors to influence the 2016 U.S. Presidential Election2, and a site where hate speech fueled anti-immigrant violence.3 The past six months have posed an additional burden, as Facebook has had to reckon with hate speech and misinformation surrounding the November 2020 General Election and the coronavirus pandemic.

 

_____________________________

1 https://www.seattletimes.com/nation-world/in-libya-facebook-is-used-to-buy-arms-locate-foes-and-kill-them/

2 https://www.nytimes.com/2017/11/01/us/politics/russia-2016-election-facebook.html

3 https://www.dw.com/en/new-study-shows-afd-facebook-posts-spur-anti-refugee-attacks/a-41972992

 

   
 

 

            

2021 Proxy Memo

Facebook | Shareholder Proposal on Board Assessment of Misinformation Mitigation Practices

 

 

During the November 2020 General Election, Facebook took important steps to ensure the site promoted accurate information. This included altering Facebook's algorithm to emphasize N.E.Q. scores, internal rankings based on the quality of news publishers.4 Since the election, however, Facebook appears to have begun removing some of these safeguards.5 It is currently unclear which policies from the 2020 General Election remain in place and which have been removed. Additionally, no analysis has been published exploring the successes and failures of these corporate policies to ensure Facebook can maintain a safe environment moving forward.

 

The lack of data on Facebook's efforts during the 2020 General Election, and the confusion over current practices, could have detrimental long-term effects for shareholders and outside stakeholders. Facebook has a responsibility to its investors to ensure the site is safe and remains well-respected. An analysis of past and current misinformation mitigation efforts is a vital step in this process.

 

RATIONALE FOR A YES VOTE

 

1. Facebook does not provide sufficient analysis to permit it or its shareholders to assess the benefits and drawbacks of maintaining, restoring, or changing the type of enhanced actions it put in place during the 2020 election cycle to reduce the platform’s amplification of false and divisive information.
2. Any assessment of the success of policies put in place since the 2020 General Election should be clearly stated and driven by data on outcomes, not by the number or type of policies put in place.
3. The proliferation of harmful information on Facebook, and a negative public perception, hurts the company in terms of advertising, user retention, and employee morale.
4. Regulatory trends pose a long-term risk to Facebook.

 

DISCUSSION

 

1. Facebook does not provide sufficient analysis of efforts to prevent misinformation and divisive rhetoric on its platform.

 

One of the most important drivers of success for social media companies is preserving user trust. When trust is broken, it can cause reputational damage and lasting harm to the goals of maintaining a strong user base and strong advertising partnerships. According to Ernst & Young, the most effective way for media companies to fight fake news is to establish trust with their users by “building a culture of integrity, compliance, and ethics that support accurate and well-verified information.”6 To that end, our Company must take internal steps to verify that its programs to reduce hate and disinformation are effective. This will in turn increase user trust and help ensure Facebook remains a reputable company.

 

_____________________________

4 https://www.nytimes.com/2020/11/24/technology/facebook-election-misinformation.html

5 https://www.nytimes.com/2020/12/16/technology/facebook-reverses-postelection-algorithm-changes-that-boosted-news-from-authoritative-sources.html

6 https://www.ey.com/en_us/forensic-integrity-services/how-media-organizations-can-get-real-and-confront-fake-news

 


In terms of measuring compliance, Facebook is falling behind the standards necessary to uphold its status among users. Laura Edelson, an NYU researcher with Cybersecurity for Democracy, has criticized Facebook for not being transparent with its data on minimizing conspiracies and hate speech: “they can't say their data leads to a different conclusion but then not make that data public.”7 A Board assessment of policies would be instrumental in assuring the public that our Company is doing everything in its power to limit the proliferation of divisive or untrue speech.

 

2. Changes in policies from the 2020 General Election must be clearly stated and driven by data.

 

The lack of analysis is not the only concern that necessitates this Board assessment. Another significant problem is that, while Facebook touts many positive steps taken during the 2020 Election, it has apparently already begun to remove some of these practices. Facebook’s responsiveness to stakeholder concerns during the Election was vital to upholding the company’s reputation, and such policies should therefore not be removed without a data-driven understanding of the impact of doing so. A NewVantage Partners survey of Fortune 1000 senior executives found that “highly data-driven organizations are three times more likely to report significant improvements in decision-making compared to those who rely less on data.”8 Given the importance of data-driven decision-making, shareholders deserve a clear understanding not only of how Facebook is fighting misinformation, but of how effective those programs are. Decisions should be made by analyzing outcomes and program effectiveness, not merely by counting the programs or actions taken or the pieces of content removed.

 

_____________________________

7 https://www.npr.org/2021/03/06/974394783/far-right-misinformation-is-thriving-on-facebook-a-new-study-shows-just-how-much

8 https://online.hbs.edu/blog/post/data-driven-decision-making

 

3. The proliferation of harmful information on Facebook, and a negative public perception, hurts the company in terms of advertising, user retention, and employee morale.

 

A negative perception of Facebook due to the persistence of false, divisive, and sometimes dangerous information harms the company in several ways. First, it poses a significant burden to maintaining positive relationships with advertisers. In July 2020, a month-long advertising boycott by many major corporations, unified under the hashtag #StopHateForProfit, signaled that companies were unhappy with the level of harmful content on Facebook. While the official boycott has since ended, several top advertisers, including Verizon, Coca-Cola, Clorox, and HP, have maintained their boycotts. Many other companies, such as Target, Nike, and Hershey, have intentionally reduced their advertising spending on Facebook.9

 

Divisiveness on Facebook also harms user retention; many individuals have joined the #DeleteFacebook campaign, which calls on users to delete their Facebook accounts and thereby damages the company’s credibility.10 The Media Search Group warns that such campaigns erode the user base over time and harm the company’s reputation.11

 

Damage to Facebook’s reputation also lowers employee morale, which could increase risk to the company and shareholders. According to a report by the New York Times, “about half [of employees] felt that Facebook was having a positive impact on the world, down from roughly three-quarters earlier this year.” Employee trust in leadership, and employees’ intent to remain at the company, decreased as well. Such a significant drop in employee satisfaction could harm stockholders: research shows a strong correlation between company morale and productivity, which could in turn affect profits.12

 

Reputational damage to Facebook increases shareholder risk over the long run. Corporations, individuals, and employees are signaling that change within our Company must occur. Given the ability of false news and divisiveness to cause widespread harm to Facebook and create risk for shareholders, a report on the effectiveness of mitigation practices is warranted.

 

4. Regulatory trends could pose a long-term risk to Facebook.

 

Facebook has historically been protected from liability for hate speech and false information by Section 230 of the Communications Decency Act. This law, passed in 1996, shields internet companies from liability for content on their sites by providing that they are not to be treated as the publisher of information posted by users, but merely as a platform for it. However, many legislators have been pushing to repeal or amend Section 230. Members of Congress from both political parties, including Republican Senator Ted Cruz of Texas and Democratic Speaker of the House Nancy Pelosi of California, have suggested that the law could be revised or revoked.13 President Joe Biden has spoken on this issue, saying, “the idea . . . is that Section 230 should be revoked, immediately should be revoked, number one. For Zuckerberg and other platforms.”14 Given that changing or removing Section 230 has become a more realistic possibility, it would be wise for Facebook to have an established understanding of whether its mitigation strategies are working. This would help limit potential liability in the future.

 

_____________________________

9 https://fortune.com/2020/11/07/facebook-ad-boycott-big-brands-lego-clorox-verizon-microsoft-hp/

10 https://mashable.com/article/delete-facebook-2020/

11 https://www.mediasearchgroup.com/blog/how-will-deletefacebook-campaign-impact-on-social-media-marketing/

12 https://www.researchgate.net/publication/309494308_Correlation_of_Morale_Productivity_and_Profit_in_Organizations

13 https://www.nytimes.com/2020/05/28/business/section-230-internet-speech.html

14 https://www.theverge.com/2020/1/17/21070403/joe-biden-president-election-section-230-communications-decency-act-revoke

 

RESPONSE TO FACEBOOK BOARD OF DIRECTORS’ STATEMENT IN OPPOSITION

 

Facebook’s Opposition Statement asserts that necessary steps are being taken to ensure transparency and that our Company is already working to minimize platform misuse. We agree that Facebook took steps during the 2020 Presidential Election and thereafter to ensure a safe and trusted platform. However, given the rollback of some of these policies, it is unclear exactly what Facebook’s practices will look like moving forward. Additionally, while the Opposition Statement cites figures such as the number of pieces of harmful content removed, it lacks context about the scope of the problem, how much harmful content may remain, and what overall impact its policies are having. Without an analysis characterizing and quantifying the benefits or harms of current actions and programs, the public, employees, advertisers, and shareholders are left to guess.

 

In the Opposition Statement, the Board asserts that current fact-checking services are sufficient to minimize false information on Facebook. However, many still question the effectiveness of the programs currently in place. An article based on research by the UK organization Full Fact asserts that the fact-checking process discussed in the Opposition Statement may not be as effective as claimed.15 Because questions remain about the success of these programs, an analysis is warranted. Stakeholders welcomed the steps taken during the election, but a data-driven assessment of those and current practices would promote trust between our Company and Facebook users.

 

CONCLUSION

 

Vote “Yes” on this Shareholder Proposal seeking a report assessing the benefits and drawbacks of the Company’s enhanced misinformation mitigation practices.

 

By supporting this proposal, shareholders will underscore the importance of Facebook assessing the success or failure of its efforts to manage misinformation and divisiveness on its platform. A failure to stem such posts creates significant risk of reputational damage. As Facebook continues to fight hate speech and false news on its platform, it is essential to understand the impact of those efforts both during the 2020 General Election and after. Information from this assessment could guide future decisions at Facebook and promote a positive reputation, minimizing long-term risk for investors.

 

_____________________________

15 https://www.newscientist.com/article/2211634-facebooks-fact-checking-process-is-too-opaque-to-know-if-its-working/

 

--

 

For questions, please contact Andrew Behar | abehar@asyousow.org

 

THE FOREGOING INFORMATION MAY BE DISSEMINATED TO SHAREHOLDERS VIA TELEPHONE, U.S. MAIL, E-MAIL, CERTAIN WEBSITES AND CERTAIN SOCIAL MEDIA VENUES, AND SHOULD NOT BE CONSTRUED AS INVESTMENT ADVICE OR AS A SOLICITATION OF AUTHORITY TO VOTE YOUR PROXY. THE COST OF DISSEMINATING THE FOREGOING INFORMATION TO SHAREHOLDERS IS BEING BORNE ENTIRELY BY ONE OR MORE OF THE CO-FILERS. PROXY CARDS WILL NOT BE ACCEPTED BY ANY CO-FILER. PLEASE DO NOT SEND YOUR PROXY TO ANY CO-FILER. TO VOTE YOUR PROXY, PLEASE FOLLOW THE INSTRUCTIONS ON YOUR PROXY CARD.

 

 
