The European Commission is planning to order websites to delete extremist content within an hour or risk being fined.
The regulation would affect Twitter, Facebook and YouTube among others.
The crackdown would see the EU abandon its current approach, under which the firms police themselves, in favour of explicit rules.
The shake-up comes in the wake of high-profile terror attacks across Europe over the past few years.
Julian King, the EU’s commissioner for security, told the Financial Times that the EU would “take stronger action in order to protect our citizens”.
The BBC has confirmed the details of the report.
In March, the EU’s civil service published details of the current voluntary arrangement, which noted that “terrorist content is most harmful in the first hours of its appearance online”.
At the time, it said there was “significant scope for more effective action”.
The BBC understands the draft regulation is set to be published next month. It would need to be approved by the European Parliament and a majority of EU states before it could be put into action.
Mr King told the FT that the law would apply to small social media apps as well as the bigger players.
“Platforms have differing capabilities to act against terrorist content and their policies for doing so are not always transparent,” he added.
A study published last month by the not-for-profit Counter Extremism Project said that between March and June, 1,348 videos related to the Islamic State group were uploaded to YouTube from 278 separate accounts, garnering more than 163,000 views.
The report said that 24% of the videos had remained online for more than two hours.
The BBC has asked Google, Twitter and Facebook to comment.
Google has previously said that more than half of the videos YouTube removes for containing violent extremism have had fewer than 10 views.
In its latest ‘transparency report’, Twitter says that between July and December 2017, a total of 274,460 accounts were permanently suspended for violations related to the promotion of terrorism. The company says 74% of those accounts were suspended before their first tweet.
If the EU’s proposed regulation is approved, it would be the first time the European Commission has explicitly targeted tech firms’ handling of illegal content.