
India issues notice to xAI Grok

"The safety work is never done, and the more powerful these models become, the more we have to think about societal impact." - Sam Altman, CEO, OpenAI

Government issues warning

The Union government cautioned X's AI app, Grok, against generating sexually explicit content. The Ministry of Electronics and Information Technology issued a directive to the firm's Chief Compliance Officer regarding these safety concerns, specifically warning against the promotion of nudity or any unlawful material through its digital platforms.

Concerns over women's safety

This followed a letter from MP Priyanka Chaturvedi about the abuse of Grok to target women with fake imagery. She urged the government to ensure X builds robust safeguards to protect users and maintain a safe digital space. The letter highlighted how AI tools are being misused to violate individual privacy.

Review of safety guardrails

The ministry ordered a review of Grok's technical frameworks, specifically prompt processing and image handling. It seeks auditable compliance to prevent the generation of obscene, vulgar, or sexually explicit responses. The firm must ensure its large language models are equipped with safety guardrails to avoid creating harmful content.

Legal consequences and compliance

X must submit a report on safety measures soon. Failure to comply with the IT Rules 2021 could lead to penal consequences and the loss of legal immunity under Section 79. This legal shield normally protects intermediaries from being held responsible for content posted by their users on the platform.

Addressing digital privacy violations

Invading privacy violates statutes such as the Bharatiya Nagarik Suraksha Sanhita (BNSS). The government warns that AI tools must not bypass the law, requiring strict adherence to mandatory reporting rules for generated content. The move aims to ensure that technological advances do not compromise citizens' fundamental rights and safety.

Summary

The government warned X over its AI tool, Grok, for generating explicit content. The ministry demands a safety review and compliance with the IT Act. Failure to act could result in X losing its legal immunity as an intermediary.

Food for thought

Can any AI platform ever be truly safe if its guardrails can be bypassed by creative but malicious prompts?

AI concept to learn: Safety guardrails

Safety guardrails are technical constraints built into AI systems that prevent the generation of harmful content. They screen user prompts, and often the model's outputs, against policy and ethical guidelines, refusing requests that violate them. These checks are essential for maintaining safety and privacy in generative tools.
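As a learning aid only, the prompt-screening idea above can be sketched as a toy rule-based filter. This is a minimal illustration, not how Grok or any production system actually works: real guardrails rely on trained safety classifiers and output-side moderation, and the blocklist terms and function names below are entirely hypothetical.

```python
# Toy sketch of a prompt-level safety guardrail: screen the user's
# prompt against a (hypothetical) blocklist before it reaches the model.
# Production systems use trained classifiers, not keyword lists.

BLOCKED_TERMS = {"nude", "explicit", "undress"}  # hypothetical policy terms


def check_prompt(prompt: str) -> tuple[bool, str]:
    """Return (allowed, reason) for a user prompt."""
    lowered = prompt.lower()
    for term in BLOCKED_TERMS:
        if term in lowered:
            return False, f"prompt contains disallowed term '{term}'"
    return True, "allowed"


def guarded_generate(prompt: str) -> str:
    """Refuse disallowed prompts; otherwise hand off to the model."""
    allowed, reason = check_prompt(prompt)
    if not allowed:
        return f"Request refused ({reason})."
    # Placeholder for the actual generative-model call.
    return f"[model output for: {prompt}]"
```

The key design point the sketch illustrates: the guardrail sits in front of the model, so a refused request never triggers generation at all. Keyword filters like this are also easy to bypass with creative phrasing, which is exactly why the "Food for thought" question below matters.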

[The Billion Hopes Research Team shares the latest AI updates for learning and awareness. Various sources are used; all copyrights are acknowledged. This is not professional, financial, personal, or medical advice. Please consult domain experts before making decisions. Feedback welcome!]
