State International Approaches and Challenges

Can Internet platforms be better regulated?

The issue of Internet regulation has been part of public discourse since the 1990s, when the Internet became increasingly accessible to the public, and different viewpoints have been offered on how far the Internet can be regulated, or even whether it ought to be regulated by the state at all. Even if it is agreed that the Internet can and should be regulated by law, peculiar characteristics of the Internet, including its technology, the geographical distribution of its users, and the nature of its content, make it difficult for states to regulate. It has also been argued that, instead of state regulation, it may be more appropriate to regulate the Internet internationally through treaties and other international mechanisms. This approach is difficult to apply, however, because countries differ in how they implement international law; the United States, for example, gives primacy to domestic law over international law unless the latter has been ratified.

The challenges associated with regulating the Internet through international law can be seen in the fact that no major international treaty reflects a consensus on the regulation of the Internet. At the same time, the permeating influence of the Internet may require some form of regulation in order to afford individuals the protection of the law in commercial and non-commercial activities undertaken online. In this essay, a critical analysis is conducted of whether Internet platforms are adequately regulated and of the ways in which they can be better regulated. This essay argues that there is a sounder case for a combination of self-regulation and statutory regulation than for either method alone. Furthermore, the current approach to regulation in the UK is moving towards such a hybrid regulatory approach, in contrast to the American approach, which is tilted more in favour of maintaining a free market for digital businesses. On the basis of this discussion, the argument is made for better regulation of the Internet than is currently the case.

The Internet has existed since 1969, but it was only in the 1990s that it saw significant growth, and it has since become ubiquitous in human society. From 1969 to the 1990s, the Internet was a network used only in the United States under the Advanced Research Projects Agency Network (ARPANET), by the military, defence contractors, and university laboratories conducting defence-related research; it was later expanded to connect

  1. Laura DeNardis, The Global War for Internet Governance (Yale University Press 2014).
  2. J Boyle, ‘Foucault in Cyberspace: Surveillance, Sovereignty, and Hardwired Censors’ (1997) 66 U. Cin. L. Rev. 177.
  3. Judge Stein Schjolberg, An International Criminal Court or Tribunal for Cyberspace. A paper for the EastWest Institute (EWI) Cybercrime Legal Working Group (EastWest Institute 2011).
  4. Andrew Murray, The Regulation of Cyberspace: Control in the Online Environment (Routledge 2007).
  5. Matthias C. Kettemann, The Normative Order of the Internet: A Theory of Rule and Regulation Online (Oxford University Press 2020).
  6. Ibid.
universities, researchers and others worldwide. As such, the question of Internet regulation was not then one of protecting the public from harmful content, as there was not much public exposure to the Internet. This is not the case today. The Internet is now easily accessible to a significant proportion of the world population, for both commercial and non-commercial purposes. Like any other market space, the Internet can be used for both licit and illicit purposes, which raises the question of regulating Internet platforms. In ACLU v Reno, the court accepted the nature of the Internet as a "giant network interconnected with a series of smaller networks." By the 1990s, and more so today, the Internet had become a site of worldwide interconnectedness, where ease of navigation and access to content means that a significant proportion of the world population has access to vast amounts of content and sites. The size of the Internet and the nature of its interconnectedness pose significant challenges for regulation. The challenging nature of the Internet does not mean that it cannot be regulated. However, the starting point of any discussion on regulation is to identify ‘who’ needs to be regulated: platforms or individual users? Individual users already come within the scope of legislative frameworks imposing liability for illegal content or misinformation, but the question of platform regulation is more complex.

    First, as to the meaning of the term ‘platform’: there is no generally accepted definition, and the term is used to distinguish a platform from other forms of online presence, such as individuals, on the basis of the platform’s facilitation of “provision and access to products, information, entertainment, opinions, sales, advertising or other content or services from a variety of sources.” The term also becomes relevant in distinguishing between content managed by a platform and content provided by an individual user. ‘Platform’ is also used for ‘online content intermediaries’. An important question is whether intermediaries are to be considered platforms or publishers, and there is often some difficulty in delineating the scope of the definitions of platforms and intermediaries. The term platform is generally used with respect to all intermediaries, but the term ‘platform’ does not appear in the relevant European legislation; instead, the E-Commerce Directive uses the term ‘online content intermediary’ to describe a subset of ‘hosting’ providers. In L’Oreal v eBay, the term ‘active’ host was used and may mean something similar to intermediary. The E-Commerce Directive defines three types of intermediary: ‘mere conduits’, ‘caching providers’ and ‘hosts’. The EU Commission has proposed the Digital Services Act, which differentiates between intermediary services (internet access providers, domain


  8. Dharmesh S Vashee, ‘ACLU v. Reno: Congress Places Speed Bumps on the Information Superhighway’ (2000) 6(3) Richmond Journal of Law & Technology 16.
  9. ACLU v Reno, 929 F. Supp. 824 (1996), [830].
  10. Hogan Lovells, Liability regulation of online platforms in the UK: A White Paper (April 2018) accessed <https://www.hoganlovells.com/~/media/hogan-lovells/pdf/2018/google-online-platforms-white-paper.pdf> p. 7.
  11. C-324/09, L’Oréal SA and Others v eBay International AG
  12. Mark Bunting, ‘Keeping Consumers Safe Online’ (2018) accessed
name registrars), hosts, online platforms (app stores and social media platforms), and very large online platforms (platforms reaching more than 10% of monthly European consumers).

    It can be argued that the first step towards improving the regulatory framework would be to clarify who is to be regulated as a platform, because at present there is little clarity on how platforms are defined across different jurisdictions. In the UK, the term ‘online intermediaries’ is used, and even with the wide scope of actors who come within this definition, it has been suggested that the existing definitions do not effectively delineate the full spectrum of actors that are involved in the Internet’s architecture and can facilitate and participate in wrongdoing. Furthermore, Internet regulation is territorially fragmented: different jurisdictions have different definitions of platforms and intermediaries and different standards of regulation, which can lead to intermediaries avoiding liability in some cases and attracting it in others.

    There are three identifiable characteristics of online content intermediaries: they operate open marketplaces through direct interaction between suppliers and consumers of information and content; they play an active role in matching content to users; and they earn revenue by taking a share of the value created on their platforms. Intermediaries do not simply allow people to upload content, but play a role in moderating content and choosing which kinds of content are promoted over others. This is one of the reasons why it is important to affix liability to intermediaries. An important point is that intermediaries are in a position to put an end to harmful or illegal activity because of their capacity to detect, prevent and control the means of wrongdoing.

    Regulation of platforms or intermediaries has also been thought to be necessitated by the fact that these actors play a role in moderating information and content, as noted recently by the Council of Europe:

    “The power of such intermediaries as protagonists of online expression makes it imperative to clarify their role and impact on human rights, as well as their corresponding duties and responsibilities, including as regards the risk of misuse by criminals of the intermediaries’ services and infrastructure… States are confronted with the complex challenge of regulating an environment in which private parties fulfil a crucial role in providing services with significant public service value.”


  14. Ethan Shattock, ‘Self-regulation 2:0? A critical reflection of the European fight against disinformation’ (2021) Harvard Kennedy School Misinformation Review, accessed
  15. Jaani Riordan, The Liability of Internet Intermediaries (Oxford University Press 2016).
  16. Catherine Stromdale, ‘Regulating Online Content: A Global View’ (2007) 13 Computer and Telecommunications Law Review 173.
  17. Bunting, ‘Keeping Consumers Safe Online’, supra n 11.
  18. Ibid.
  19. Riordan, The Liability of Internet Intermediaries, supra n 13.
  20. Council of Europe, Recommendation on the roles and responsibilities of internet intermediaries (Recommendation CM/Rec(2018), 2 March 2018) accessed
    Therefore, there is a justifiable argument in favour of regulating intermediaries or online platforms. The question is how such regulation should be put into effect. At this point, it is also important to engage with the theory of Internet regulation. The regulation of the Internet is made complex by its nature as a vast, interconnected space without borders. Due to this, it has been argued that cyberspace, as a global electronic social space, is a site where national governments have neither a moral right to rule nor efficient methods of enforcement. There are multiple and overlapping systems of rules applicable to the Internet, which makes it inappropriate for any state to justifiably claim comprehensive law-making in this area. Against this background, two prominent theories of Internet regulation have been propounded in the literature: cyber-libertarianism and cyber-paternalism, which offer contrasting views on Internet regulation. Although both theoretical approaches are premised on the viewpoint that the Internet is a unique form of communication, they offer different answers to the question of how far and in what way the Internet should be regulated by law. The cyber-libertarian perspective argues that regulation by a state is not appropriate because there are no territorial boundaries on the Internet, and that norms of regulation should instead be defined by the digital community. In other words, the cyber-libertarian approach emphasises self-governance of the Internet. This kind of approach has been called a “bottom-up private ordering” of the Internet, which avoids the need for regulation by a bureaucratic state.

    The cyber-libertarian approach has been opposed by cyber-paternalism, which takes the forms of cyber-realism and techno-determinism; cyber-paternalism is essentially an umbrella term comprising both. Cyber-realism argues that the Internet can be regulated on the basis of traditional jurisdiction and law. Techno-determinism posits that the idea that the Internet cannot be regulated, also termed Internet exceptionalism, rests not on the impossibility of regulating the Internet but on the practical challenges of enforcing regulatory norms online. Lawrence Lessig, who proposes a cyber-paternalistic approach to Internet regulation, argues that by re-reading traditional regulatory approaches in light of the Internet’s characteristics and architecture, and relating these to the markets, laws, and norms around the Internet, it is possible to regulate the Internet through state-made law. The architecture of the Internet is unique, but it has a capacity to engender rights and duties, which makes it possible to also


  22. John Perry Barlow, ‘A Declaration of the Independence of Cyberspace’ (1996) accessed
  23. Chris Reed and Andrew Murray, Rethinking the Jurisprudence of Cyberspace (Edward Elgar 2018) 14.
  24. DR Johnson and D Post, ‘Law and borders: the rise of law in cyberspace’ (1996) 48(5) Stanford Law Review 1367.
  25. NW Netanel, ‘Cyberspace Self-Governance: A Skeptical View from Liberal Democratic Theory’ (2000) 88 Calif L. Rev. 395.
  26. Chris Reed, Internet Law: Text and Materials (Cambridge University Press 2004).
  27. Joel R Reidenberg, ‘Lex Informatica: The Formation of Information Policy Rules Through Technology’ (1998) 76(3) Texas Law Review 553.
  28. Lawrence Lessig, ‘The Law of the Horse: What Cyberlaw Might Teach’ (1999) 113 Harvard Law Review 501.

    regulate the Internet through the architecture of these rights and duties. An approach that seeks to take a balanced view of state regulation and self-regulation is proposed by Andrew Murray, who posits that, since the Internet is a site for communication and discourse, direct legal-regulatory control alone is not appropriate, and other actors and stakeholders can also provide means of regulation. Clearly, there is a division in the discourse around Internet regulation, with divergent theoretical approaches to how such regulation can take place. Some approaches deny the ethical basis for such regulation, some accept the power of the state to regulate, while others argue for a middle way in which government and other actors all play a role in norm-building. In the next sections of the essay, a critical and comparative discussion is undertaken of how states have responded to Internet regulation through their laws and policies, with a view to identifying how the Internet can be, and has been, regulated in different jurisdictions. The argument is that a middle approach, in which some aspects of regulation are undertaken by state legislation and others through self-regulation, offers a more effective and nuanced response to the regulation of platforms.

    One of the early statutes on Internet regulation is found in the United States, where Congress enacted the Communications Decency Act of 1996 with the aim of protecting minors from explicit material on the Internet. The law criminalised the knowing transmission of obscene or indecent messages to recipients under 18 years of age. The statute was challenged before the United States Supreme Court in ACLU v Reno, where the court held that there is a difference between Internet communication and the other forms of communication on which the Supreme Court had earlier ruled where First Amendment speech rights had been invoked. The court was of the opinion that the Communications Decency Act of 1996 lacked the precision required under the First Amendment for regulation of the content of speech, and that it restricted the freedom of speech of adults when less restrictive alternatives would be at least as effective in achieving the legitimate purpose the statute was enacted to serve.

    This early decision on Internet regulation reflects some important aspects of the Internet as a mode of communication and the challenges that states may face in regulating it, because the US Supreme Court invalidated the Communications Decency Act of 1996 in ACLU v Reno. A subsequent attempt to regulate Internet content with a view to protecting children was made in the Children's Online Privacy Protection Act of 1998. This Act restricts the online collection of personal information from children under the age of 13 and requires that platforms that maintain chat rooms directed at children must either condition a child's participation on the consent of a parent or guardian, or monitor the chat room and censor references to personal


  30. Paul Schiff Berman, Law and Society Approaches to Cyberspace (Ashgate Publishing 2007).
  31. Andrew Murray, Information Technology Law: The Law and Society (Oxford University Press 2013).
  32. Reno v ACLU, 521 US 844 (1997).
  33. Ibid, per Justice John Paul Stevens.
  34. Vashee, ‘ACLU v. Reno: Congress Places Speed Bumps on the Information Superhighway’, supra n 7.
information. One of the questions raised in this context is whether the law infringes on the free speech rights of children.

    Indeed, the question of free speech rights is an important part of any discussion on Internet regulation, because regulation of the Internet touches on several aspects of individual liberty, such as free speech, as well as equality, fairness, and human rights in general. Internet regulation thus becomes an area that requires careful balancing of different interests. While on the one hand the Internet is a space where individuals may face risks to their privacy and other interests, it is also a space for innovation, access to knowledge and information, and access to opportunities, which makes regulation a delicate balancing act for the state. Critics of regulation therefore point to paternalistic attitudes towards individual freedom when the state makes regulation that is seen to impinge on free speech rights; this was seen in the case of the Children's Online Privacy Protection Act of 1998, which critics considered an infringement of children’s right to free speech, as well as of the platforms’ right to commercial speech. The criticism hinges on the argument that, in the case of children and the possibility of harm in online environments, it is parents who must regulate the activities of children, not the government. To go back to the ACLU v Reno judgment, the view of the US Supreme Court was also that a statutory provision that places a financial burden on speakers because of the content of their speech is presumptively inconsistent with First Amendment free speech rights. Clearly, a paternalistic approach to the regulation of intermediaries may lead to difficulties, because intermediaries cannot act as sole gatekeepers adjudging speech rights; at the same time, a complete lack of regulation can lead to perverse outcomes for the rights of those who are harmed, or whose rights are violated, by unregulated content.

    In the UK, the regulation of online content proceeds through the primary responsibility of the creator of content to ensure that content is lawful and the secondary responsibility of a platform operator to remove unlawful content from its website. The principal legislation relevant to the regulation of the Internet comprises the Digital Economy Acts of 2010 and 2017 (although these do not provide a comprehensive scheme of content regulation), the Communications Act 2003 (although this does not cover online content or platforms) and the E-Commerce Directive (2000/31/EC). The Draft Online Safety Bill imposes duties on intermediaries to meet certain standards and also confers regulatory functions on Ofcom (the proposed regulator). The object of the Draft Online Safety Bill is to “make provision for and in connection with the regulation by OFCOM of certain internet services; and to


  36. Charlene Simmons, ‘Protecting Children While Silencing Them: The Children's Online Privacy Protection Act and Children's Free Speech Rights’ (2007) 12(2) Comm L & Pol'y 119.
  37. Ibid.
  38. Anita L Allen, ‘Minor Distractions: Children, Privacy and, E-Commerce’ (2001) 38 Houston L. Rev. 751.
  39. Melanie L Hersh, ‘Is COPPA a Cop Out? The Child Online Privacy Protection Act as Proof that Parents, Not Government, Should Be Protecting Children's Interests on the Internet’ (2001) 28 Fordham Urb. L.J. 1831.
  40. Ibid.
  41. Reno v ACLU, 521 US 844 (1997).
  42. Hogan Lovells, Liability regulation of online platforms in the UK: A White Paper, supra n 9.
make provision about and in connection with OFCOM’s functions in relation to media literacy.” The EU E-Commerce Directive (2000/31/EC) also provides for liabilities arising out of the functioning of networks and is relevant to the regulation of intermediaries. The Court of Justice of the European Union (CJEU) has considered the issue of intermediary liability in Peterson v Google, where the question before the court was whether Google could be held liable in damages, and be subject to an injunction, for hosting on YouTube videos containing copyright-infringing material. The court decided that the operator of a platform is allowed the protections of the E-Commerce Directive unless it has the requisite wrongful knowledge in connection with its hosting of copyright-infringing material. Wrongful knowledge is therefore an important component of intermediary liability; without it, the intermediary enjoys the protection of the E-Commerce Directive (2000/31/EC). Under the Directive, intermediaries are protected and are liable for illegal content only if they have ‘actual knowledge’ of it and have failed to act ‘expeditiously’ to remove or block it.

    Intermediaries enjoy what is called a safe harbour in the law, and the first law to provide this safe harbour was Section 230 of the Communications Decency Act 1996, an American statute, which provides that: “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” One of the criticisms of this safe harbour approach is that while it facilitated the growth of the Internet in its early stages, it also facilitated the growth of an illicit digital economy; this is expressed as follows:

    “Passed in 1996, the CDA (Communications Decency Act 1996) was an attempt by Congress to accommodate competing values and facilitate an uncertain but promising future digital world. Since that time, this digital world has changed drastically. Some argue that § 230 is in part responsible for the growth of the digital economy and the “Internet as we know it.” Others argue that the “Internet as we know it” is not what we want it to be, particularly when it comes to sex trafficking, pornography, child sex-abuse images, and exploitation. It is clear that, whatever § 230 did for the legitimate digital economy, it also did for the illicit digital economy.”

    The case law and jurisprudence developed in the aftermath of the passage of Section 230 in the United States demonstrate the challenges that courts face in affixing liability on platforms that permit illicit activities, such as sex trafficking, which has also led courts to call for legislative action so that platforms enjoy only limited protection. It is important to note, however, that Section 230 was


  44. Draft Online Safety Bill, Objective, accessed p. 2.
  45. Cases C-682/18 and C-683/18, Frank Peterson v Google LLC, YouTube Inc., YouTube LLC, Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando AG (C-683/18).
  46. Ibid.
  47. Mary Graw Leary, ‘The indecency and injustice of Section 230 of the Communications Decency Act’ (2018) 41 Harv. JL & Pub. Pol'y 553, p. 554.
  48. Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 29 (1st Cir. 2016); M.A. v. Vill. Voice Media Holdings, 809 F. Supp. 2d 1041, 1058 (E.D. Mo. 2011).
enacted in the aftermath of the 1995 decision of the New York Supreme Court in Stratton Oakmont, Inc. v Prodigy Services Co, where the court held that the platform was liable for information published on it, on the ground that it voluntarily deleted some messages and was therefore responsible for the content it failed to delete. The decision was superseded by the enactment of Section 230. The intent of the law is therefore clearly to protect platforms from liability for content published on them by their users. The difficulty is that the proposition that platforms should have such protection from legal liability does not sit easily with the fact that platforms and intermediaries moderate content and choose which kinds of content are promoted over others.

    Another law used in the United States to limit the liability of intermediaries is the Digital Millennium Copyright Act of 1998 (DMCA), which immunises digital businesses against liability for copyrighted content posted by users if the business takes the content down when notified of the copyright violation. This also indicates the tilt of American law towards the protection of the platform. The ground on which the New York Supreme Court held the platform liable in Stratton Oakmont, Inc. v Prodigy Services Co does raise a relevant issue: if the platform has the power to moderate content, why should it not be responsible for failing to delete content that is illegal? This question becomes even more important when it is considered that content may be harmful to individuals, or even criminal in some cases.


    The question of Internet regulation raises important questions about what kinds of behaviour or content need to be regulated and for what purposes. One important aspect of this discussion relates to the harms that individuals may be exposed to in online environments. For instance, children have been reported to be at risk of harm due to their online behaviour and their access to certain kinds of sites and content, which calls for means of reducing such harms. Children are also exposed to cyberbullying, a phenomenon reported to be increasing even among children of young ages. Another phenomenon that has been a cause of concern not just with regard to minors but adults as well is trolling on online platforms. Trolling has been defined as an “utterer producing an intentionally false or incorrect utterance with high order intention [the plan] to elicit from recipient a particular response, generally negative or violent.” While there are a number of legislative measures in place to respond to individual acts of trolling, such as the Sexual Offences Act 2003, the Malicious Communications Act 1988, and the Criminal Justice and


  50. Stratton Oakmont, Inc. v. Prodigy Services Co, No. 31063/94, 1995 WL 323710 (N.Y. Sup. Ct. 1995).
  51. Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997).
  52. David S Evans, ‘Deterring bad behaviour on digital platforms’ in David S Evans, Allan Fels, and Catherine Tucker (eds.), The Evolution of Antitrust in the Digital Era: Essays on Competition Policy (Boston: Competition Policy International 2020).
  53. S Livingstone and M Bober, ‘UK Children Go Online: Surveying the experiences of young people and their parents’ (2004) LSE Research Online.
  54. J Wang, RJ Iannotti, TR Nansel, ‘School bullying among adolescents in the United States: Physical, verbal, relational, and cyber’ (2009) 45(4) Journal of Adolescent Health 36.
  55. Scott Thacker and Mark D Griffiths, ‘An exploratory study of trolling in online video gaming’ (2012) 2(4) International Journal of Cyber Behavior, Psychology and Learning (IJCBPL) 17.
Courts Act 2015 and the Communications Act 2003, there is little legislative action on the regulation of platforms.

    In the United States, to go back to Section 230 of the Communications Decency Act 1996, the intent of the provision is twofold: first, to preserve some regulation of online platforms in the context of federal criminal law (Section 230(e)(1)); and second, to provide a free market space for the Internet and the digital economy. The latter is reflected in Section 230(b)(2), which seeks to “preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation.” The call to provide a free market “unfettered by Federal or State regulation” reflects the principal intent of the law: to minimise regulation of the Internet. However, is this intent also reflected in the laws regulating platforms or intermediaries in the UK?

    In the UK, there is a growing consensus on the need to find more effective means of regulating intermediaries; for instance, the UK Parliament has stated that, in the changing digital world, the existing legal framework is no longer fit for purpose. A recommendation has also been made to appoint a regulator, such as the UK Information Commissioner or the communications regulator Ofcom, tasked with combating disinformation directly by licensing content providers and their systems for content moderation. The argument for such state regulation of online content through appointed regulators is that government should not impose the judgement exercise of regulating online content on “online intermediaries, who are inexpert in and not incentivised to judge fundamental rights, and not bound by States’ international human rights commitments.” The UK is bound by the European Convention on Human Rights, Article 10 of which provides for the freedom of expression and also lists the restrictions that governments may impose on that freedom. If the online intermediary alone is responsible for moderating online content, there is a possibility that the intermediary will restrict speech in the name of regulation. The point made by those who argue for statutory regulation, however, is that online intermediaries cannot be made responsible for regulating content when this also involves judging human rights, which is the task not of intermediaries but of government.

    At the very least, the issue of platform regulation raises questions involving conflicting values of fairness, equality, innovation, human rights, and the public interest. The question is how the laws of countries like the United Kingdom, the United States and other European countries have resolved these conflicting values. In

    UK House of Commons, Interim Report on Disinformation and ‘Fake News’, Select Committee on Digital, Culture, Media and Sport (2018), accessed

    UK Information Commissioner’s Office, Democracy Disrupted? Personal Influence and Political Influence (2018), accessed , Recommendation 10.

    Chris Marsden, Trisha Meyer, and Ian Brown, ‘Platform values and democratic elections: How can the law regulate digital disinformation?’ (2020) 36 Computer Law & Security Review 105373, p. 2.

    the United States, there is a tilt towards free market principles, which is reflected in the safe harbour provision of Section 230 of the Communications Decency Act 1996. In Europe, the European Commission has noted that online platforms pose many regulatory challenges, and that addressing them requires both legislative intervention and principles-based self-regulatory/co-regulatory measures to provide monitoring mechanisms, with more emphasis on the latter. This means that preference may be given to self-regulation under a principles-based framework rather than to an approach that emphasises government regulation. However, the question is how this approach addresses the conflicting values of fairness, equality, innovation, human rights, and the public interest. It has been considered that when online intermediaries are allowed more self-regulation, counterbalances that safeguard public interests are also required. The European Commission has noted that market-based solutions to regulation may not be adequate “to ensure fair and innovation-friendly results, facilitate easy access for new market entrants and avoid lock-in situations.” There is a difficult question relating to the digital economy, which was no doubt faced by the US Congress when it enacted Section 230 of the Communications Decency Act 1996: how to balance the needs of innovation in the digital economy with the need to protect rights. In the United States, the emphasis is on the free market.

    In the European Union, there is also a discussion with respect to the public interest and human rights. In the context of the public interest, it has been argued that self-regulation serves the public interest better where three conditions are met:

    “first, that the activity is afflicted by some form of market failure, notably externalities or information asymmetries; secondly, that private law instruments are inadequate or too costly to correct the failure; and, thirdly, that self-regulation is a better (cheaper) method of solving the problem than conventional public regulation.”

    In the UK, the Communications Act 2003 placed an obligation on the convergent media regulator Ofcom to interpret the public interest in terms of individual consumer welfare in matters relating to networks and services. This mandate has not, however, extended to Ofcom regulating online intermediaries. Nevertheless, there is some evidence in the UK of movement away from self-regulation towards more government regulation. The UK government’s Internet Safety Strategy of 2017 reflects a tilt towards a legally-driven and law-enforcement-based


  57. European Commission, ‘Communication on Online Platforms and the Digital Single Market. Opportunities and Challenges for Europe’ (2016) COM 288.
  58. Michèle Finck, ‘Digital co-regulation: designing a supranational legal framework for the platform economy’ (2018) European Law Review, accessed
  59. European Commission, ‘Building a European Data Economy’ (2017) COM 9, p. 10.
  60. Anthony Ogus, ‘Rethinking Self-Regulation’, in Robert Baldwin, Colin Scott and Christopher Hood (eds) A Reader on Regulation (Oxford University Press 1998) p. 374.
  61. Terry Flew, Fiona Martin, and Nicolas Suzor, ‘Internet regulation as media policy: Rethinking the question of digital communication platform governance’ (2019) 10(1) Journal of Digital Media & Policy 33.
  62. Majid Yar, ‘A failure to regulate? The demands and dilemmas of tackling illegal content and behaviour on social media’ (2018) 1(1) International Journal of Cybersecurity Intelligence & Cybercrime 5.
  approach. The Parliamentary Committee also recommended this shift in its 2017 report. The shift must be seen in conjunction with the policy decisions to task Ofcom with a focus on ‘Online Harms’ and the Digital Markets Unit with competition, which reflects the direction in which regulation in the UK is proceeding: the use of government regulatory bodies to regulate online platforms. UK courts have also been more willing to affix liability to intermediaries, as demonstrated in cases like Shetland Times v Willis and Godfrey v Demon Internet Ltd. In Shetland Times v Willis, the court granted a temporary restraint barring the defendants from copying headlines from a newspaper onto their website and creating hyperlinks to locations on the pursuer's site, which allowed users to bypass the pursuer's home page and be taken directly to the article in question. In Godfrey v Demon Internet Ltd, the High Court of England and Wales refused to allow a UK-based Internet Service Provider (ISP) a defence to a defamation action in a case where an unknown Internet user had created an obscene and defamatory posting and fraudulently attributed its authorship to a UK professor. The ISP was held liable in defamation for failing to remove the posting for more than 20 days, on the ground that, once notified, it knew or had reason to know that the impugned statement was defamatory. These cases demonstrate a greater engagement with the balancing of interests and values involved in the regulation of online platforms in the UK.

    In conclusion, the regulation of the Internet is fraught with dilemmas surrounding the conflicting values of the free market and innovation in the digital economy on the one hand, and the protection of the rights of individuals, including children, who may be exposed to various harms and rights violations on the Internet on the other. Non-regulation of the Internet is not an option, as this is a market that cannot be left to regulate itself. Intermediaries cannot be the sole gatekeepers, because they are not the appropriate judges of the human rights and constitutional rights of users. Some forms of self-regulation can be useful in creating systems of regulation that are cheaper to implement and effective in responding to illicit and illegal content on the platforms. Self-regulation cannot, however, be the sole method, because certain aspects require governmental regulation. The recent UK government approach shows a tilt towards a combination of government regulatory mechanisms,

    particularly through Ofcom and the Digital Markets Unit, and self-regulation with inputs from intermediaries. It is submitted that this approach is likely to be more effective than the near-absolute immunity given to intermediaries or platforms in the United States. Any regulation in the form of


  64. Department for Digital, Culture, Media & Sport, Internet Safety Strategy green paper (London: HMSO 2017).
  65. Committee on Standards in Public Life, Intimidation in public life: A review by the committee on standards in public life (London: HMSO 2017).
  66. Martin Kretschmer, Philip Schlesinger and Ula Furgal, ‘The emergence of platform regulation in the UK: an empirical-legal study’ (2021) accessed < https://zenodo.org/record/4884877#.YdbLbpMzZ8d>
  67. Shetland Times v Willis [1997] SC 316.
  68. Godfrey v Demon Internet Ltd [2001] QB 201.
  69. Shetland Times v Willis [1997] SC 316.
  70. Godfrey v Demon Internet Ltd [2001] QB 201.
  statutes or policy should also clarify the meaning of ‘platform’ or ‘intermediary’, so that it is clearer to whom liability attaches; this is not currently the case. Finally, it is important to reiterate that the regulation of intermediaries involves questions of conflicting values that need a balanced approach, which can be provided by a combination of self-regulation and formal regulation.

Cases

ACLU v Reno, 929 F. Supp. 824 (1996).

C-682/18 and C-683/18, Frank Peterson v Google LLC, YouTube Inc., YouTube LLC, Google Germany GmbH (C-682/18) and Elsevier Inc. v Cyando AG (C-683/18).

Godfrey v Demon Internet Ltd [2001] QB 201.

C-324/09, L’Oréal SA and Others v eBay International AG

Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 29 (1st Cir. 2016).

M.A. v. Vill. Voice Media Holdings, 809 F. Supp. 2d 1041, 1058 (E.D. Mo. 2011).

Shetland Times v Willis [1997] SC 316.

Stratton Oakmont, Inc. v. Prodigy Services Co, No. 31063/94, 1995 WL 323710 (N.Y. Sup. Ct. 1995).

Zeran v. Am. Online, Inc., 129 F.3d 327, 331 (4th Cir. 1997).

Books

DeNardis L, The Global War for Internet Governance (Yale University Press 2014).

Evans DS, ‘Deterring bad behaviour on digital platforms’ in David S Evans, Allan Fels, and Catherine Tucker (eds.), The Evolution of Antitrust in the Digital Era: Essays on

Competition Policy (Boston: Competition Policy International 2020).

Kettemann MC, The Normative Order of the Internet: A Theory of Rule and Regulation Online (Oxford University Press 2020).

Murray A, The regulation of cyberspace: control in the online environment (Routledge 2007).

Ogus A, ‘Rethinking Self-Regulation’, in Robert Baldwin, Colin Scott and Christopher Hood (eds) A Reader on Regulation (Oxford University Press 1998).

Reed C, Internet Law: Text and Materials (Cambridge University Press 2004).

Reed C and Andrew Murray, Rethinking the Jurisprudence of Cyberspace (Edward Elgar 2018).

Riordan J, The Liability of Internet Intermediaries (Oxford University Press 2016).

Schjolberg S, An International Criminal Court or Tribunal for Cyberspace. A paper for the EastWest Institute (EWI) Cybercrime Legal Working Group (EastWest Institute 2011).

Journals

Allen AL, ‘Minor Distractions: Children, Privacy and, E-Commerce’ (2001) 38 Houston L. Rev. 751.

Berman PS, Law and Society Approaches to Cyberspace (Ashgate Publishing 2007).

Boyle J, ‘Foucault in Cyberspace: Surveillance, Sovereignty and Hardwired Censors’ (1997) 66 U. Cin. L. Rev. 177.

Flew T, Fiona Martin, and Nicolas Suzor, ‘Internet regulation as media policy: Rethinking the question of digital communication platform governance’ (2019) 10(1) Journal of Digital Media & Policy 33.

Hersh ML, ‘Is COPPA a Cop Out? The Child Online Privacy Protection Act as Proof that Parents, Not Government, Should Be Protecting Children's Interests on the Internet’ (2001) 28 Fordham Urb. L.J. 1831.

Johnson DR and D Post, ‘Law and borders: the rise of law in cyberspace’ (1996) 48(5) Stanford Law Review 1367.

Leary MG, ‘The indecency and injustice of Section 230 of the Communications Decency Act’ (2018) 41 Harv. JL & Pub. Pol'y 553.

Lessig L, ‘The Law of the Horse: What Cyber Law Might Teach’ (1999) 113(501) Harvard Law Review 506.

Livingstone S and M Bober, ‘UK children go online: surveying the experiences of young people and their parents’ (2004) LSE Research Online.

Marsden C, Trisha Meyer, and Ian Brown, ‘Platform values and democratic elections: How can the law regulate digital disinformation?’ (2020) 36 Computer Law & Security Review 105373.

Murray A, Information Technology Law: the law and society (Oxford University Press 2013).

Netanel NW, ‘Cyberspace Self-Governance: A Skeptical View from Liberal Democratic Theory’ (2000) 88 Calif L. Rev. 395.

Reidenberg JL, ‘Lex Informatica: The Formation of Information Policy Rules Through Technology’ (1998) 76 (3) Texas Law Review 553.

Simmons C, ‘Protecting Children While Silencing Them: The Children's Online Privacy Protection Act and Children's Free Speech Rights’ (2007) 12(2) Comm L & Pol'y 119.

Stromdale C, ‘Regulating Online Content: A Global View’ (2007) 13 Computer and Telecommunications Law Review 173.

Thacker S and Mark D Griffiths, ‘An exploratory study of trolling in online video gaming’ (2012) 2(4) International Journal of Cyber Behavior, Psychology and Learning (IJCBPL) 17.

Vashee DS, ‘ACLU v. Reno: Congress Places Speed Bumps on the Information Superhighway’ (2000) 6(3) Richmond Journal of Law & Technology 16.

Wang J, RJ Iannotti, TR Nansel, ‘School bullying among adolescents in the United States: Physical, verbal, relational, and cyber’ (2009) 45(4) Journal of Adolescent Health 36.

Yar M, ‘A failure to regulate? The demands and dilemmas of tackling illegal content and behaviour on social media’ (2018) 1(1) International Journal of Cybersecurity Intelligence & Cybercrime 5.

Reports

Committee on Standards in Public Life, Intimidation in public life: A review by the committee on standards in public life (London: HMSO 2017).

Department for Digital, Culture, Media & Sport, Internet Safety Strategy green paper (London: HMSO 2017).

European Commission, ‘Communication on Online Platforms and the Digital Single Market. Opportunities and Challenges for Europe’ (2016) COM 288.

European Commission, ‘Building a European Data Economy’ (2017) COM 9, p. 10.

Others

Barlow JP, ‘Declaration of the Independence of Cyberspace’ (1996) accessed

Bunting M, ‘Keeping Consumers Safe Online’ (2018) accessed

Council of Europe, Recommendation on the roles and responsibilities of internet intermediaries (Recommendation CM/Rec(2018), 2 March 2018) accessed

Finck M, ‘Digital co-regulation: designing a supranational legal framework for the platform economy’ (2018) European Law Review, accessed

Hogan Lovells, Liability regulation of online platforms in the UK: A White Paper (April 2018) accessed < https://www.hoganlovells.com/~/media/hogan-lovells/pdf/2018/google-online-platforms-white-paper.pdf>.

Shattock E, ‘Self-regulation 2:0? A critical reflection of the European fight against disinformation’ (2021) Harvard Kennedy School Misinformation Review, accessed

UK House of Commons, Interim Report on Disinformation and ‘Fake News’, Digital, Culture, Media and Sport Committee (2018), accessed

UK Information Commissioner’s Office, Democracy Disrupted? Personal Information and Political Influence (2018), accessed .
