The information security of children: Self-regulatory approaches

DOI: 10.11621/pir.2014.0312

Vartanova, E.L., Tolokonnikova, A.V., Cherevko, T.S.

Lomonosov Moscow State University, Moscow, Russia

Abstract

The 21st century has been characterized by tremendous changes in mass-media systems. The rapid growth of the Internet, driven by progress in communication technologies and digitalization, has resulted in the rise of new interactive media. Developments increasing the scope and speed of media production and distribution have drawn particular attention to the information security of audiences – in particular, to protecting children from content that might be harmful or inappropriate for their age. Unlike adults, who are accustomed to living in an information-rich society, children cannot yet fully understand and filter content. Digital media thus have profound effects on young audiences, shaping children’s psychology and emotions.

Recognizing this development, the most economically advanced countries have elaborated specific media policies to ensure that children receive the advantages of new media and simultaneously are kept safe from harmful content. These policies, aimed at traditional media (press and analogue broadcasting), have been based on legal approaches, but in digital reality laws do not always produce the desired effects because the law-making process often does not keep up with technological change. Governments, therefore, have to share their responsibilities with the nongovernmental sectors: private business and civil society. Even countries with strong government influence over public life, such as Singapore, are working toward a co-regulated and self-regulated mass-media industry. Many countries, including those in Western Europe, North America, and Asia, already have experience with these policies.

The article reviews practices in the field of media aimed at guaranteeing children’s information security and at opposing harmful content. It points to key aspects of the regulation of market-driven media content in different countries.

Received: 21.06.2014

Accepted: 20.08.2014

Themes: Media and cyber psychology; Security psychology

PDF: http://psychologyinrussia.com/volumes/pdf/2014_3/2014_3_136-145.Pdf

Pages: 136-145

Keywords: mass media, children’s information security, children and the mass media, self-regulation of the mass media, Internet regulation

Introduction

The digital revolution: New opportunities and new problems

Recent decades have brought significant changes in perceptions of children’s safety in their access to information. Technological progress, by driving the rapid development of the Internet, has radically changed the traditional media environment. Digital technologies have “mixed” conventional media: TV programs today can easily be watched on the screens of personal computers and tablets; connected TV sets have become “windows” to the Internet; and a traditional newspaper text is transformed into a convergent multimedia product combining text, video, and audio.

The digital revolution has opened endless possibilities for creating and disseminating news and entertainment content. However, it has also posed new problems for mass media and audiences by increasing the amount of accessible information and fostering competition between the old/analogue and new/digital media; by challenging existing legislation, business models, and copyrights; and by disregarding the traditional roles of journalism and the values of audiences. At the core of the crucial issues of the digital age stands the problem of children’s information security: the need to secure children’s rights to safe media and a safe Internet in order to prevent serious threats to their psychological health. Unlike adults, who have gotten used to living in an information-rich environment, children, because of their age and social position, remain vulnerable to intense media and are unable to filter the content coming from different media sources and technological platforms, which are always increasing their influence on the younger audience (Fenton, 2010; Tornero & Varis, 2010).

TV programs, online video resources, digital games, and other new media options pose a significant threat today because they are the most accessible types of content. In the analogue era parents could limit children’s access to media by hiding inappropriate publications, but in today’s digital reality it has become rather complicated to control what children watch on multichannel TV sets and multiscreen media devices.

Unattended interaction between children and television or the Internet may cause serious harm to their health and age-specific development. Recognizing this threat, the majority of North American, European, and Asian countries, including Russia, have taken a number of measures to regulate media content targeted at children. Harassment of various kinds, child prostitution, child pornography, drugs, instructions for producing explosives, and depictions of violence are among the kinds of media content countries are trying to prevent children from seeing (Roskomnadzor, 2013).

At the same time surveys of children, who spend more and more time on the Internet, show that they often face other threats, of which many parents are unaware. One example is the so-called cyberbullying phenomenon, which involves deliberate assaults, threats, use of offensive language, mocking, and swearing on the Internet. In some countries, such as Canada, the struggle against cyberbullying stands at the core of children’s information-security policy. However, in other countries, including Russia, the state and media do not pay enough attention to it, although, for example, about 11% of the users of the most popular Russian social network, Vkontakte, are younger than 18 (TNS Web Index, 2014).

One of the main current problems is that in many countries legislative initiatives in the sphere of Internet regulation lag behind the rapid development of information and communication technologies. In previous decades the protection of children’s interests in the mass media involved a comprehensible set of measures: restricting children’s access to adult movies in the cinema, not showing action movies on television in the daytime, keeping adult magazines in special packaging. However, today, because of the rapid progress of the digital media, many of the established measures have become obsolete, and there is confusion even about which agencies should be responsible for such regulation.

Nowadays one regulatory body often integrally controls the information technology, media, and telecommunications “ecosystem” (De Prato, Sanz, & Simon, 2014). This is the case in the United States, Canada, the Republic of Korea, and Japan, for example. Singapore has a single watchdog for all media; it was established by merging three separate regulatory bodies. Russia has been moving in the same direction since the establishment of the Ministry of Communications and Mass Media and its controlling agency, Roskomnadzor (the Federal Service for Supervision of Communications, Information Technology and Mass Media). Thus, the ongoing convergence of media technologies has resulted in the merger of state agencies that regulate previously independent mass-media sectors.

However, state regulation of the traditional media and the Internet cannot always secure control over content in a way that is effective and relevant to public demand. That is why a movement toward co- and self-regulation of the media and journalism, especially the new media, is topical right now (Vartanova, 2006). The efficiency of such control mechanisms is vividly illustrated by the experience of Canada, where government policy in the sphere of the Internet has always been oriented toward limiting state interference and encouraging self-regulation. For this reason the challenge of new media has become a spur for the development of public initiatives and self-regulatory organizations. As a result of their activities, a system for filtering Internet content has been created in Canada. It appears to be rather efficient in blocking harmful content compared with the systems designed in China and the Republic of Korea under conditions of strong state control (Roskomnadzor, 2013).

The movement toward self-regulation seems to be a universal trend even for countries with strong state influence in the media, where the governments try to share responsibilities in regulating new technologies with the nongovernmental sector. This is especially true for Singapore, a country with total state regulation.

Self-regulation of the media: main features

Non-legislative regulation of the media has emerged under the influence of different factors, including the concepts of democracy and freedom of speech, but it has been implemented in different ways depending on state structure, level of democratic development, historical prerequisites, and cultural, religious, and moral environments (Fedotov, 2009).

In one way or another, self-regulatory practices allow the professional community in many countries to define independently the rules according to which mass media operate; these rules are based on the principles of public well-being, journalistic professionalism, and ethics. In some cases the need to form internal standards emerges inside the journalistic community in a voluntary way, without any interference from the state or other external societal forces. In other cases the authorities encourage the emergence of self-regulatory institutions, relying on the support of “decent” media outlets, or adopt laws that oblige the journalistic community to create self-regulatory bodies and allow them to develop documents regulating the activities of the mass media; such arrangements are called “legislative self-regulation” (Roskomnadzor, 2013).

In countries such as Denmark, Germany, France, and Lithuania the main principles of self-regulation are prescribed in the laws, while their elaboration, specification, and practical realization are the responsibility of representatives of the media. This approach is based on the concept of a “legislative framework” (Roskomnadzor, 2013). Sometimes journalists and the state create a regulatory body in the sphere of mass media together; an example of such an approach is Ofcom (the Office of Communications), the independent regulator and competition authority for the communications industry in Great Britain.

Practice shows that the participation of the state in the creation of mass-media self-regulation institutions is often justified. The history of the emergence and development of voluntary self-regulatory bodies in the mass media of countries in Western Europe indicates that in conflict situations between society and the mass media the state played a key role in solving conflicts and relieving tensions; this experience shows why the region might be considered a pioneer of self-regulation. Based on media practice in many countries, the initiators of self-regulation at present include the following actors:

  • media enterprises (internal guidelines, media critique)
  • corporate journalistic organizations (codes of professional ethics)
  • organizations/professional associations of media owners and managers (the business community) (unwritten codes of conduct, labeling system for audiovisual content)
  • the advertising industry (codes for advertising activities)
  • public organizations of a mixed nature (press councils, industry ombudsman)
  • the academic community (mid-career training programs, research)
  • the audience itself (nongovernmental organizations, associations for the protection of consumer rights, organizations of parents and teachers) (Vartanova, 2009, p. 162).

Media self-regulation in most cases has been organized as a nongovernmental independent system; this arrangement ensures that the media are accountable for their activities not to the state but to society. Legal responsibility in the system is replaced by ethical responsibility based on moral and cultural norms.

In the media business self-regulation has become a voluntary recognition of the industry’s social responsibility to safeguard the conditions of legislatively guaranteed freedom of speech. In return for their independence from the state, media companies have assumed certain duties: to establish a dialogue with society, to react in due time to its needs, and not to neglect the coverage of issues that raise public concern.

Within a mature self-regulation system, submitting a complaint to self-regulatory bodies usually does not require any particular effort or financial expense from the complainant and results in a quite fast resolution of the problem without bureaucratic delays. In addition, tackling an issue connected with correcting factual errors or violating human rights within the framework of the journalistic community decreases the workload of the country’s court system.

All societal stakeholders accept that in this case the criticism of irresponsible journalists comes not from the state or its agencies but from the journalistic community. Representatives of the media industry and professional communities should create quality standards and set the required boundaries. The mechanism for submitting complaints to voluntary media bodies and ombudsmen helps in many ways to eliminate the disadvantages of content control and to sort out controversial issues. Objective consideration of a complaint by the professional community allows errors to be corrected. This mechanism also guarantees that self-regulation stays outside politics and encourages internal control of the mass media with the support of civil society.

In a number of states in continental Europe self-regulation allows effective control of the content of printed and audiovisual materials without state interference. This mechanism is also ensured by the fact that journalists, editors, and other members of the media business with practical experience have a good understanding of the information needs of their audience, possess an insider’s knowledge of the situation in the market, and know the hidden hazards in the media industry. Government officials, who observe media activities from the outside, do not possess this kind of knowledge.

The reasons for self-regulation are numerous. The professional journalistic community itself has a profound interest in creating an effective system of self-regulation because it gives media representatives clear reference points for applying professional norms and ethics in their work. Besides, additional legislative restrictions seem too tough and inefficient for Western European and North American mass media. Where freedom of speech is well established, the press and other media may fight for the abolition of restrictive clauses only if they convince society of their conscientiousness and responsibility. There are also other reasons for the development of self-regulation in the media sphere. The Organization for Security and Co-operation in Europe defines five main motives: (1) self-regulation preserves editorial freedom; (2) it helps minimize state interference; (3) it encourages enhancing the quality of the mass media; (4) it serves as proof of the responsibility of the mass media; (5) it encourages audience access to the mass media (Haraszti, 2008).

A developed self-regulation system, on the one hand, increases the quality of journalists’ work and, on the other hand, protects the interests of the mass media, providing them with competent help in solving conflict situations. As a result, it is a useful instrument both for mass-media companies as content creators and for the audience as consumers of media content.

Best practices of contemporary media self-regulation to protect children

The wide distribution of digital media means that in many countries attempts to fight legally against media content that could be harmful to minors might be ineffective. Current methods of receiving media content are making the user less tied to particular geographical locations and nation-states. Media globalization therefore makes internationally adopted rules and instruments for protecting children from undesirable content a topical issue.

To protect children from harmful content many countries have elaborated specific strategies using experience with self-regulation mechanisms already established in the media. The most widespread measures are content-labeling systems, “watershed” (or “safe-harbor”) systems, and V-chip technology.

Content-labeling systems can be considered the most developed and widely used tool for marking media content with regard to age differences. Labeling is also a method of dividing TV content into “programs for everyone” and “adult programs” (this method is usually called “watershed”). A content label is a formal indicator of the audience a certain information product or media content is aimed at. Such labeling indicates that the product may be harmful for audiences who have not reached the age indicated on the label or is too complicated for them to understand. In European countries special regulatory bodies deal with content labeling. In Great Britain this is the task of the British Board of Film Classification; in Germany, the Federal Service for Checking Information for Youth; in Austria, the Commission for Protecting Children’s and Youth’s Rights; in Latvia, the Expert Commission under the auspices of the Culture Ministry.

The decisions of these bodies vary from country to country. An example of such differences is provided by the classification results of the independent European system Kijkwijzer (from the Dutch expression meaning “watch wisely” or “viewing guide”), which labels all kinds of content: television, cinema, DVDs, video games, mobile TV, and others. Thus, the movie The Wolf of Wall Street is recommended in Great Britain for viewers 18 and older; in Germany, 16; in Sweden, 15; and in France it can be watched by children 12 and older (NICAM/Kijkwijzer, 2014).

Table 1. “Watershed” time for TV channels in selected countries

Country            Time of “watershed”    Comments
Great Britain      5:30 am-9:00 pm        Ban on programs showing erotic content or violent scenes
France             6:00 am-10:30 pm       Ban on programs showing erotic content or violent scenes; until 8:30 pm commercials for such programs are also banned
United States      6:00 am-10:00 pm       Ban on programs showing erotic content or violent scenes
Canada             6:00 am-9:00 pm        Ban on programs showing erotic content or violent scenes
Italy              7:00 am-10:30 pm       Ban on programs and films “14+”
Australia          5:00 am-9:30 pm        Ban on programs and films “15+”
Germany            5:30 am-8:00 pm        Ban on programs and films “12+”
                   5:30 am-11:00 pm       Ban on programs and films “16+”
The Netherlands    6:00 am-8:00 pm        Ban on programs and films “12+”
                   6:30 am-10:00 pm       Ban on programs and films “16+”

A “watershed,” or “safe-harbor,” system sets a special time slot during which all programs, excluding those that are broadcast under special conditions, must meet general criteria and be suitable for a general audience, including children. The “watershed” time is spelled out in a special law. There is an exception for some paid TV channels if they are accessible only via a PIN code.

These rules are currently followed by the TV channels almost without violation, and self-regulatory bodies usually monitor compliance. However, in some cases concrete sanctions have been enforced against violators: for example, in 2004 CBS had to pay a fine of 3.63 million dollars for broadcasting scenes that violated these rules during the watershed time. For parents, though, these rules are in most cases a recommendation: they decide whether to allow their children to watch labeled films or films broadcast during the “watershed” time, and the methods do not work without parental control.
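As a rough illustration of how such a rule can be checked in software, the sketch below (in Python) tests a program’s age label against a single “watershed” window per country. The window data is simplified from Table 1, and all names are hypothetical; real schemes (for instance, Germany’s two-tier windows) are more nuanced.

```python
from datetime import time

# Hypothetical, simplified "watershed" rules drawn from Table 1:
# country -> (window start, window end, age label banned inside the window).
# Countries that ban erotic/violent scenes rather than age labels are omitted.
WATERSHEDS = {
    "Italy": (time(7, 0), time(22, 30), 14),
    "Australia": (time(5, 0), time(21, 30), 15),
    "The Netherlands": (time(6, 30), time(22, 0), 16),
}

def may_broadcast(country: str, start: time, age_label: int) -> bool:
    """Return True if a program with the given age label may air at `start`."""
    window_start, window_end, banned_from = WATERSHEDS[country]
    inside_window = window_start <= start < window_end
    # Outside the protected window any label is allowed; inside it,
    # the program's label must stay below the banned threshold.
    return not inside_window or age_label < banned_from

print(may_broadcast("Italy", time(20, 0), 14))  # False: "14+" banned until 10:30 pm
print(may_broadcast("Italy", time(23, 0), 14))  # True: outside the window
```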

V-chip technology is a tool for regulating television content for children regardless of whether they are alone in front of the screen or with their parents. In the United States a rule adopted by the Federal Communications Commission mandates that all TV sets with screens 13 inches or larger be equipped with special devices that allow viewers to block the broadcasting of programs by using age-labeling technology known as the violence chip (V-chip) (Federal Communications Commission, 2007, p. 32). The V-chip is used in the United States, Canada, and Brazil. It recognizes coded information about a film or program category, and when violent or sexual scenes are shown, it dims the screen, thus protecting minors from information that is not appropriate for them. However, according to a TV Watch survey, 88% of parents do not use the V-chip, despite the fact that the technology is available on almost all TV sets (Luntz, Maslansky Strategic Research & Hart Research, 2007).
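The blocking logic a V-chip implements can be sketched in a few lines: the chip compares the rating decoded from the broadcast signal with the limit set by the parents and blocks anything above it. The class and method names below are assumptions made for illustration; only the US TV Parental Guidelines rating scale itself is real.

```python
from dataclasses import dataclass

# The US "TV Parental Guidelines" scale, ordered from mildest to most adult.
RATING_ORDER = ["TV-Y", "TV-Y7", "TV-G", "TV-PG", "TV-14", "TV-MA"]

@dataclass
class VChip:
    parental_limit: str  # the highest rating the parents allow

    def allows(self, program_rating: str) -> bool:
        """Compare the rating decoded from the signal with the parental limit."""
        return RATING_ORDER.index(program_rating) <= RATING_ORDER.index(self.parental_limit)

chip = VChip(parental_limit="TV-PG")
print(chip.allows("TV-G"))   # True: at or below the limit, the program is shown
print(chip.allows("TV-MA"))  # False: the chip blocks the program (dims the screen)
```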

Children’s safety on the Internet

The expert and academic communities widely recognize that the Internet has profound effects on the audience, both positive and negative. For young audiences it can be considered both a powerful educational and pedagogic tool and a major threat to their psychological safety and moral values. For this reason the creation of tools to regulate content flows on the Internet is extremely urgent nowadays, and in this field many agents, both state and nongovernmental, cooperate in diverse ways. This collaboration produces an extension of the self-regulation system: co-regulation mechanisms.

One of the interesting examples is the Safer Internet Program. The project is aimed at extending the rights of children and young people in the global network and at protecting them by raising the awareness level of users and by fighting dangerous and destructive content as well as illegal behavior on the Internet. Within the framework of this European Union (EU) program, Safer Internet Centers were founded in 30 European countries in order to create conditions for the safe and responsible use of the Internet and mobile devices by children. Several types of centers can be distinguished by function:

  • awareness centers, which disseminate information and conduct campaigns and information meetings involving children, parents, teachers, and trainers for the purpose of raising their awareness about the potential online risks for children and methods for safety control on the Internet
  • helplines, which give personal advice to children, parents, and teachers about methods of safety control on the Internet
  • hotlines, which accept messages about cases of detected illegal content on the Internet

The work of the hotlines is coordinated by INHOPE, a special international association of Internet hotlines. It includes all EU member states, the United States, Canada, Japan, and some other countries (in total, 43 members in 2013). According to data from the association, in 2013, 1,210,893 messages were received regarding harmful or illegal content; in 71% of the cases those affected were children under 13. In Russia the Centre for the Safe Internet and the Friendly Runet Fund are members of the organization. A total of 55 million euros was allocated for the development of the Safer Internet Program during 2009-2013 (INHOPE, 2013).

The need to protect minors from harmful materials on the web was recognized in the late 1990s. In the early 2000s there was an attempt to label Internet content (in the same way as TV content). In the United States a standard for site labeling, RSACi (Recreational Software Advisory Council on the Internet), was developed. From the start the idea was welcomed, and Microsoft joined the system (at that time the company controlled a significant share of the Internet-browser market).

However, with the further development of the Internet it became obvious that such labeling could not fulfill its aims. The system could not keep pace with the speed at which the Internet grew, and the emergence of social networks and personal pages reduced its effectiveness even further. It also turned out to be impossible to extend the scheme to an international scale, as it did not take into account the cultural particularities of the users.

In addition, research on the BBC Guidance content-labeling system showed that a label marking sexual materials or violent scenes actually attracted teenagers to a site. In 2009 the European Commission issued a report saying that labeling content on the Internet, unlike labeling other forms of media content (films, videos, games), was ineffective (Sparrow, Bazelon, & Jackson, 2009).

In 2009 the EU member states adopted a declaration entitled “Self-Regulation for a Better Internet for Children.” Many media and key Internet players have joined the declaration, including Google, Microsoft, Facebook, Yahoo, Deutsche Telekom, RTL Group, Samsung, Vivendi, Vodafone, and MySpace (21 companies in all). According to the document, the Internet companies take responsibility for developing and building in safety technologies that allow parents to limit children’s access to undesirable content. For instance, the Google settings now include the SafeSearch option, which when activated enables users to choose between two variants of filtration: strict (filtering both indecent images and text) or moderate (filtering only indecent images).

The Russian search engine Yandex also has special tools for filtering “adult” content. The system provides two limiting options: a “safe” one, which excludes adult sites from the search results if the user is not searching for them deliberately, and a “for children” option, which totally excludes from the search results sites that contain swearing or pornography.
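The two-tier model both search engines describe (a strict mode that drops indecent text and images, and a milder mode that drops only indecent images) reduces to a simple predicate applied to each search result. The sketch below is a hypothetical illustration of that logic; the field names and flags are assumptions, not either company’s actual API.

```python
from typing import Iterable, List

def filter_results(results: Iterable[dict], mode: str) -> List[dict]:
    """Keep only results permitted under the given filtering mode."""
    kept = []
    for r in results:
        if r.get("indecent_image"):
            continue  # both modes drop results with indecent images
        if mode == "strict" and r.get("indecent_text"):
            continue  # strict mode additionally drops indecent text
        kept.append(r)
    return kept

sample = [
    {"url": "a", "indecent_image": False, "indecent_text": False},
    {"url": "b", "indecent_image": True,  "indecent_text": False},
    {"url": "c", "indecent_image": False, "indecent_text": True},
]
print([r["url"] for r in filter_results(sample, "moderate")])  # ['a', 'c']
print([r["url"] for r in filter_results(sample, "strict")])    # ['a']
```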

Microsoft, which traditionally fights Internet threats, has embedded a “parental control” function in the Windows 7 and Windows 8 packages. It allows limiting the time during which children can log into the system as well as limiting access to some applications. Microsoft has also made an attempt to control access to video games that may be harmful to children: tools for limiting access to unsafe gaming programs are integrated into the company’s software, and “parent-friendly” settings are available on the Xbox 360 console and the Xbox LIVE service.

Apart from safety settings, companies that have supported the declaration have also agreed to provide users with the possibility of informing the company about illegal or harmful content on the Internet. An example of such cooperation is presented by the video-hosting site YouTube. The service allows users to flag videos that might contain dangerous information (elements of violence, child pornography, interference in private life). A claim is automatically sent to an analytical center, where the video is reviewed by experts. If the content is harmful, it is deleted from public access, and sanctions are applied against its dissemination.
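The flag-and-review flow described above can be modeled as a simple queue: a user’s flag enqueues a claim, and a human reviewer then decides whether the content is removed from public access. Everything below is a hypothetical sketch of that workflow, not YouTube’s actual implementation.

```python
from queue import Queue
from typing import Callable, Optional

review_queue: Queue = Queue()

def flag_video(video_id: str, reason: str) -> None:
    """A user flag is sent automatically to the review center."""
    review_queue.put({"video_id": video_id, "reason": reason})

def review_next(is_harmful: Callable[[str, str], bool]) -> Optional[str]:
    """An expert reviews the next claim; harmful content is taken down."""
    if review_queue.empty():
        return None
    claim = review_queue.get()
    if is_harmful(claim["video_id"], claim["reason"]):
        return f"removed {claim['video_id']}"
    return f"kept {claim['video_id']}"

flag_video("abc123", "elements of violence")
print(review_next(lambda vid, reason: "violence" in reason))  # removed abc123
```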

Conclusion

To conclude this analysis of the existing self-regulating practices for the new media aimed at children, it might be argued that even though many countries have been doing a lot to create universal and effective tools for protecting minors from harmful information, this goal has not yet been achieved. Technologies for disseminating digital content have been developing much faster than methods for limiting and filtering it. Despite the existing systems for labeling and filtering Internet content, an important and probably more effective means of protection lies in parental control and in media education and literacy programs. Survey results in Russia support this view: about 68% of respondents over 21 believe that the decision to forbid watching certain TV programs should be made by parents themselves; only 22% are ready to rely on the recommendations of TV channels (FOM, 2012).

However, the need to protect children from media content that can be harmful to their health and development remains urgent. Generalizing from the experience of self-regulation of audiovisual and multimedia content in different countries, one can identify the following main strategies:

  • developing media literacy and digital competence among children, teachers, and parents
  • helping users become acquainted with safety technologies on the Internet
  • giving users an opportunity to report detected dangerous or illegal content (through special hotlines or interactive forms on sites)
  • organizing quick and specific reactions when users relay such information
  • modernizing and stimulating the active use of tools for detecting banned content in the activities of Internet providers and major Internet companies

References

De Prato, G., Sanz, E., & Simon, J. P. (Eds.). (2014). Digital media worlds. The new economy of media. London: Palgrave Macmillan. doi:10.1057/9781137344250

Federal Communications Commission. (2007). Report FCC 07-50. Retrieved from https://apps.fcc.gov/edocs_public/attachmatch/FCC-07-50A1.pdf

Fedotov, M. (Ed.). (2009). Nastolnaja kniga po medijnomu samoregulirovaniju [Handbook on media self-regulation]. Moscow: UNESCO.

Fenton, N. (2010). New media, old news: Journalism and democracy in the digital age. London: Sage.

FOM. (2012). Zakon o zaschite detej ot vrednoy informacii [Law “On the protection of children from information harmful to their health and development”]. Retrieved from http://fom.ru/Bezopasnost-i-pravo/10629

Haraszti, M. (2008). The media self-regulation guidebook. Vienna: Organization for Security and Cooperation in Europe.

INHOPE. (2013). Reports received and processed by region for 2013. Retrieved from http://in-hope.org/Libraries/Infographics/INHOPE-2013-Inforgraphic.sflb.ashx  

Luntz, Maslansky Strategic Research & Hart Research. (2007). TV Watch Survey of Parents Topline. Retrieved from http://www.televisionwatch.org/JunePollResults.pdf

NICAM/Kijkwijzer. (2014). Europese classificaties [European classifications]. Retrieved from http://www.kijkwijzer.nl/europese-classificaties/page25.html

Roskomnadzor. (2013). Koncepciya informacionnoj bezopasnosti detej [The concept of information security for children]. Retrieved from http://rkn.gov.ru/mass-communications/p700/p701/

Sparrow, M. K., Bazelon, C., & Jackson, C. (2009). Can Internet gambling be effectively regu­lated? Managing the risks. Retrieved from http://financialservices.house.gov/media/file/hearings/111/sparrow.pdf

TNS Web Index. (2014). Russia 100k+, January 2014, monthly reach, 12-54 years. Retrieved from http://www.tns-global.ru/services/media/media-audience/internet/information

Tornero, J. M. P., & Varis, T. (2010). Media literacy and new humanism. Moscow: UNESCO Institute for Information Technologies in Education.

Vartanova, E. L. (2006). Samoregulirovanije v informacionnom obschestve [Self-regulation in an information society]. Vestnik Moscovskogo Universiteta. Ser. 10: Zhurnalistika [Moscow University Journalism Bulletin], 3, 8-19.

Vartanova, E. L. (2009). Globalizacia informacionnyh potokov kak faktor antiterroristicheskoi deyatelnosti [Globalization of information flows as a factor in antiterrorist activities]. In E. L. Vartanova (Ed.), Zhurnalistika i SMI protiv terrorizma [Journalism and the mass media against terrorism] (pp. 144-167). Moscow: MediaMir.

To cite this article: Vartanova E.L., Tolokonnikova A.V., Cherevko T.S. (2014). The information security of children: Self-regulatory approaches. Psychology in Russia: State of the Art, 7(3), 136-145.

The journal content is licensed with CC BY-NC “Attribution-NonCommercial” Creative Commons license.
