In a letter to the Ministry of Electronics and Information Technology (MeitY) earlier this week, BSA — the software alliance — requested that the government avoid a “one-size-fits-all” approach in the proposed deepfake amendments to the Information Technology (IT) Rules.
BSA is a global alliance of software companies, including giants such as Adobe, Cisco, Microsoft, and IBM, with a presence in over 30 countries.
The alliance has recommended that the proposed amendments to the IT Rules account for differences in the roles and functions of intermediaries when prescribing obligations related to the spread of deepfakes.
“This is crucial due to key service-level, technical, functional, and user-based distinctions that ensure that all intermediaries do not have the same ability to address this issue,” reads the letter.
Business Standard has seen a copy of the letter.
The industry body, in its recommendation, argues that distinct services provided by different intermediaries may not pose the same kind of risk.
“For example, business-to-business and enterprise software services pose limited risk to user safety and public order, given the size of their user base and the fact that they do not provide services directly to consumers,” it said.
The alliance has also suggested that the government consider ‘content authenticity solutions’ as an approach to tackling the deepfake menace.
It has asked the government to encourage the use of watermarks or other disclosure methods for artificial intelligence (AI)-generated content, which can help users tell whether content is real or AI-generated.
“It is important that content credentials or watermarks or metadata not be stripped and are instead preserved by platforms. This will ensure that the public can see it wherever they are consuming online content,” reads the letter.
In December last year, Meity issued an advisory asking social media intermediaries and platforms to ensure that users on their platforms do not violate the content restrictions placed under Rule 3(1)(b) of the IT Rules and also directed firms to educate users about prohibited content.
Rule 3(1)(b) of the IT Rules specifies 11 types of user harms or content prohibited on digital intermediaries.
Citing non-compliance with the existing rules, the ministry had announced that it would bring an amendment to the IT Rules to ensure stricter enforcement around deepfakes and misinformation.