Content moderation on online gaming platforms
Introduction
Content moderation is a necessary step in online gaming, as inappropriate content frequently surfaces while playing, creating an unsafe environment for young users. Such content may include violent or obscene material. Recently, the Centre has moved to bring online content providers, including online gaming services and advertisements, under the purview of the Ministry of Information and Broadcasting.
In-game content moderation of online games
Many games come with safety settings that enable parental controls, letting parents set time limits, block inappropriate content and determine with whom their children can interact and play. It must also be ensured that online gaming platforms do not host any offensive or inappropriate content and that they comply with regulatory guidelines and policies.
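As a purely illustrative sketch, such safety settings might be represented and enforced as follows (the field names and structure here are hypothetical, not any platform's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class ParentalControls:
    """Illustrative per-child safety settings a platform might enforce."""
    daily_limit_minutes: int = 60
    block_mature_content: bool = True
    allowed_contacts: set = field(default_factory=set)  # empty = no strangers

    def can_play(self, minutes_played_today: int) -> bool:
        # Enforce the daily time limit set by the parent.
        return minutes_played_today < self.daily_limit_minutes

    def can_chat_with(self, player_id: str) -> bool:
        # Only contacts explicitly approved by the parent may interact.
        return player_id in self.allowed_contacts
```

A real platform would persist these settings per account and check them at matchmaking and chat time; the sketch only shows the shape of the checks.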
Intermediary guidelines
The intermediary guidelines, notified as the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, fix the liability of an intermediary that publishes restricted online curated content. The guidelines also provide a grievance mechanism for cases where a publisher publishes content in contravention of the Code of Ethics laid down under them.
Player behaviour
Players’ behaviour plays an important role in maintaining a safe and healthy gaming environment. However, when players are matched with unknown users in online games, there have been cases of foul and abusive language, cyberbullying, and malicious links shared through in-game chat features. Such threats on online gaming platforms need to be addressed through government policies. Players must adhere to the platform’s rules, guidelines and terms of service, and encouraging good behaviour within the online gaming community is necessary for a better gaming environment.
Chat moderation
Online mobile games offer chat features that enable interaction between players. However, there have been instances of gamers being harassed, abused, cyberbullied and demotivated by anonymous users in chat. Hence there is a need for chat moderation, which can help maintain a safe and healthy gaming environment for users.
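As a minimal sketch of what automated chat moderation can look like, a platform might screen each message against a blocklist and strip suspicious links before display. The terms below are placeholders; production systems combine much larger, localised lists with machine-learning classifiers and human review.

```python
import re

# Hypothetical blocklist; real platforms maintain far larger, localised lists.
BLOCKED_TERMS = {"insult1", "slur1"}
URL_PATTERN = re.compile(r"https?://\S+")  # flag links to curb malicious URLs

def moderate_message(text: str) -> str:
    """Return the message with blocked terms masked and links removed."""
    words = []
    for word in text.split():
        if word.lower().strip(".,!?") in BLOCKED_TERMS:
            words.append("*" * len(word))  # mask abusive terms
        elif URL_PATTERN.match(word):
            words.append("[link removed]")  # neutralise shared links
        else:
            words.append(word)
    return " ".join(words)
```

Keyword masking alone is easy to evade (misspellings, leetspeak), which is why the approaches above are typically only the first layer of a moderation pipeline.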
Live streaming of online games
Live streaming allows creators to broadcast their gameplay and engage a live audience. Such real-time broadcasting needs to be regulated, because there have been incidents of foul or abusive language being used during live streams. Platforms provide community guidelines prohibiting such behaviour, and creators, gaming influencers and players alike must follow these guidelines and adhere to each platform’s rules for audio-visual content.
Prohibition of CSAM content
CSAM (child sexual abuse material) is prohibited by law and regulatory policy; however, there have been instances of children encountering age-inappropriate content while playing online games. Gaming platforms must deploy technical measures to ensure they do not host CSAM or other inappropriate content and to block such content from appearing on screen.
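One widely used technical measure is hash matching: comparing the hash of each uploaded image against databases of known-prohibited material maintained by industry bodies. The sketch below uses exact SHA-256 matching only for illustration; the hash value shown is a placeholder, and real deployments typically use perceptual hashes (such as PhotoDNA) so that slightly altered copies also match.

```python
import hashlib

# Placeholder entry; real deployments use industry-maintained hash databases
# of known prohibited material, not a hard-coded set.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(content: bytes) -> bool:
    """Block an upload if its SHA-256 digest matches a known-bad hash."""
    return hashlib.sha256(content).hexdigest() in KNOWN_BAD_HASHES
```

The point of the sketch is the workflow, checking uploads against a vetted list before they are hosted, rather than the specific hash function, which in practice must tolerate re-encoding and cropping.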
Conclusion
Gaming platforms have a responsibility to moderate content in compliance with government guidelines. With the gaming sector booming in India, best practices should be encouraged to establish a safer and more responsible online gaming environment for all users, especially children. Gaming platforms should carry no inappropriate or illicit content, and gaming entities, players and other stakeholders must advocate for a safe online gaming environment.
Author: Neeraj Soni, Intern – Policy & Advocacy team, CyberPeace